Problem with export file names

Hi.

In our documentation export, we have the problem that a few files cannot be written because of their file names:

Error: Failed to write to file \xxx.xxx.xxx.xxx\docinsight\APP\PrintingClasses.TPrintExecuteParams. Create(TComponent,string,string,string,string,string,string,TParameterArray,TPersonnelList,TOnBeforePrint,TOnAfterPrint,Boolean,Boolean,Integer,TDruckMailClients,Variant,Variant,string,Variant,Variant,Variant,Variant,Variant,Variant,IDruckToHistExtensions).html.

Error: Could not write to the file \xxx.xxx.xxx.xxx\docinsight\APP\Paklist. AddPackage(string,Currency,Currency,Currency,Currency,Currency,Integer,Integer,Integer,Integer,Integer,Integer,Integer,Integer,Integer,Integer,Integer,Integer,Currency,String,Boolean,Variant,Integer,Integer,String,Currency,Integer,Integer,Integer,String,String,Integer,Integer,Integer).html.

Can you solve this :wink: ?

Greetings,
T.Elmers

@TElmers Thanks for the detailed report! I’ve filed it as Could not write to the html file due to long parameter types · Issue #50 · devjetsoftware/docinsight-support · GitHub

It was caused by the long signature of an overloaded method. We could solve it by:

  • using a short hash of the signature for each overloaded method, e.g. AddPackage-abcdef.html
  • merging all overloads into the same HTML file (Microsoft Docs uses this approach).
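A rough sketch of the first option (purely illustrative — the shortened signature and the choice of SHA-1 are my assumptions, not the actual DocInsight implementation):

```shell
# Hypothetical sketch: derive a short, stable suffix from the full
# overload signature so the output file name stays short.
sig='AddPackage(string,Currency,Integer)'    # shortened signature, for illustration only
suffix=$(printf '%s' "$sig" | sha1sum | cut -c1-6)
echo "AddPackage-${suffix}.html"
```

Because the suffix depends only on the signature, the same overload always maps to the same file name, so links stay stable across builds.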

Which way do you prefer? Do you have other suggestions?
By the way, the overall documentation build should still complete even with this error, right? (Otherwise we need to file another issue.)

Both solutions are okay.
Personally, I would prefer the ‘merge into one file’ option. Maybe you could make this controllable via a setting (command-line parameter).

The export continues despite this error.
It would also be great if the export were a bit faster; in our case it runs for just under an hour. But other factors also play a role here (network, virus scanner, …).

Thanks for your input!

I’ll consider the “merge” option when introducing new modern templates, but for now, I might stick with the “hash” approach.

Regarding performance, I aim for the DocInsight CLI to be blazingly fast. I’ve noticed that Windows Defender has a significant impact on CHM output generation. Further investigation is needed for the HTML output’s performance.

Here are some plans for improvement:

  • Add profiling tools (logs/graphs) to make performance observable.
  • Options to customize the directory layout and structure of topics.
  • Introduce incremental compilation in the future.

By the way, have you used DocInsight 3 CLI to generate HTML output for your codebase? If so, what was the rough elapsed time? On my local machine, I ran a simple test and observed a significant performance boost.

By the way, have you tried the --dry-run option with the docinsight build command? It simulates the build process without generating any files. The elapsed time is also a useful metric for performance improvement, as it should fully utilize your machine’s capabilities. How many topics are in your project?

Unfortunately, I no longer have any comparison times from DocInsight 3.

I will start another ‘--dry-run’ today and report back.

Hi.

I have times now:
The dry run takes 5 minutes, which I think is very fast!
Then 32 minutes for the real run.
It’s clear that you can’t achieve 5 minutes with this number of files; the network and hardware probably play a role here.

We have 1930 units in our project file (yes, we have been developing the project for a long time :wink: )

Thanks for the information! The dry run is reasonably fast. The bottleneck might be I/O or a system service like antivirus or file indexing (you could take a look at Task Manager the next time you run the build).
The DocInsight CLI should tell you how many HTML files/topics were generated; that would be helpful. I used the RTL to test the performance of medium/large projects, and I’ll expand testing to heavier source projects.

@TElmers Forgot to mention, the current I/O handling has room for optimization. You could try reducing the number of jobs with the --jobs N argument (e.g. -j4) to see whether it improves performance.

Unfortunately, in my case, this doesn’t help that much.
With ‘-j10’: two minutes.

@TElmers
with --dry-run?

I have now made the call again with ‘--dry-run’ and ‘-j4’: 8 minutes.

With ‘--dry-run’ and ‘-j10’: 6 minutes.

So in principle the parameter does something, but in our case the network seems to be the bottleneck.

Sorry, I didn’t make that clear. I meant trying a real build with -j4 to see if there is an improvement. With --dry-run, the default value should be the most efficient (N equals the number of CPUs), as the transformation process is CPU-bound.
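As a small aside, on a Linux-like shell the CPU count behind that default can be checked with `nproc`; the docinsight command line below is only an illustration built from the flags already mentioned in this thread:

```shell
# The default --jobs value should equal the CPU count, which is
# usually optimal for the CPU-bound dry run.
jobs=$(nproc)                                  # number of available CPUs
echo "would run: docinsight build --dry-run -j${jobs}"
```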

Regarding the network: do you use the docinsight CLI to build docs and output to a remote location? Are the source projects also remote?

Our source project is stored locally. DocInsight then writes the documents to the remote location.

Got it. Created the issue: Profile and optimize the process of generating documentation to a remote location · Issue #51 · devjetsoftware/docinsight-support · GitHub

p.s. It might be faster to output to local storage and then copy all of them to the remote location.

@TElmers

You could try using docinsight to build the docs locally and then use the built-in robocopy command to copy the files to the remote location (with the /MT /Z switches). rsync may be more performant and supports incremental sync.

@TElmers

Hi, did you try building locally and then copying/syncing to the remote network? If so, did that improve the time?

Hi. Yes, I did.

With RoboCopy: one hour and 19 minutes.
Exporting directly to the network drive: 38 minutes.

I think our virus scanner is the bottleneck. Not your problem :wink:
THX

Thanks for the feedback. I’ll look into this later. It should improve noticeably once we introduce incremental compilation.