ducks2k

Notes on CDBurnerXP Command Line (cdbxpcmd.exe)

My specific needs are to make backup data disks of a relatively small number of root folders, where the subfolders and files may be added or deleted between backups. Saving a "compilation file" allows you to re-burn an exact set of files, but it will not pick up anything added since the compilation was made. Although I have been burning CD/DVDs for years, I am not totally familiar with things like multi-session recording or the differences between data and music recording. I'm using Windows 7 x64.

I am having decent success with the Command Line interface, but there are many aspects of the documentation page that are just not detailed enough, so I've had to discover a lot through trial and error. Here are some notes - in no particular order.

1. It IS possible to organize a multi-folder burn in a single ".bat" file and execute it, either from the command interpreter (cmd.exe) or by having a Windows program create the batch file and run it. One of the key aspects was new to me: a caret "^" as the LAST character of a line means the command is continued on the next line. Here is a batch file that works sensibly for me to back up 11 audio book folders - 5 by David Baldacci and 6 by Tom Clancy. I also copy the batch file itself to the DVD so I can see what commands produced what result. (Make sure there is no blank after any of the "^"s.)

"c:\Program Files (x86)\CDBurnerXP\cdbxpcmd.exe" --burn-data -device:0 ^

-file:c:\Scopy\burn\burn.bat ^

-folder[\Baldacci]:\MP3Books\Baldacci ^

-folder[\Clancy]:\MP3Books\Clancy ^

-name:BookTest6 -close

2. When this .bat file is executed, the cdbxpcmd program prints a series of lines as it gathers the files from the specified folders. It then gives the strange and misleading message "Starting to write disc... 3 Files, Size: 6.39 GB". As best as I can tell it should really be "3 Folders", and the GB number seems to always be roughly twice what it should be. Then comes a series of 0%, 1%... as the DVD burns, which takes the appropriate amount of time and usually produces a good disk. I have not explored many of the failure modes, such as trying to write more than the disk can hold.

3. One aspect of the "-folder" command that gives me a lot of trouble is the "[dstpath]" specification. The command "-folder[\Clancy]:\MP3Books\Clancy" creates a root folder on the DVD of "\Clancy" and places each of the subfolders of "\MP3Books\Clancy" into it. This is pretty much what I would expect, but there are several variations that make little sense.

3a. If I try the perfectly legal "-folder[\XXX]:\MP3Books\Baldacci -folder[\XXX]:\MP3Books\Clancy" I will get the Baldacci subfolders placed in \XXX , but not the Clancy folders which just get ignored. I'm not sure if this is a bug or a feature. It would seem logical to me that the second input folder can be sensibly appended to the "\XXX" output folder, but I may not be visualizing the internal workings of the program correctly.

3b. The "[dstpath]" specification is effectively limited to a single root folder on the destination disc. The sensible specification "-folder[\MP3Books\Baldacci]:\MP3Books\Baldacci" seems to be acceptable syntax, but it produces a folder on the DVD named "\MP3Books_Baldacci#0EA0". If I visualize the operation as in the drag/drop version of the program, this is not something that I could sensibly do with a mouse. The files themselves seem to be OK; only the folder naming is odd. This may not be a bug, but it certainly needs to be documented. As a "feature", it would seem like I should be able to create the "_" folder name correctly within the square brackets, so I suspect the existing behavior is buggish - the result of a specification that is syntactically correct but not intended to be handled by the program.

4. The "--" commands, like "--erase", are limited to one per execution. This is almost obvious, but not quite.

4a. It is not clear if any of the "-" options are executed in any particular order, or whether it matters.

5. I have not checked/verified many of the "-" options, but some, like "-dao", may not work in the way I'm imagining, and some cause program crashes -- probably bad combinations, but the restrictions are not obvious.

6. DVD capacity. This is not strictly a command-line problem, but figuring out what will burn successfully to disk is tricky. There must be some website that details the explanation, but I haven't located it yet. Audio books are a relatively small number of files per gigabyte of space needed. I'm working from the total of the Windows-reported file sizes, but there is file block-size overhead, data-disk overhead, and simply the definition of the conversion from bytes to gigabytes -- all to map against the stated 4.7 Gbytes of space listed on the DVD box. As far as I can tell so far, CDBurnerXP will fail if the requested size is too large, but it does not appear to have written anything to the disk at that point, so the blank can be successfully reused with a smaller request. So far, my largest successful burn is 4,541,128,900 bytes and my smallest failed burn is 4,664,909,858 bytes (from a relatively small number of trials). All my tests are a single burn with the disk closed at the end.
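A rough sketch of the arithmetic involved, for anyone puzzling over the same numbers. The single-layer capacity figures below are my assumed nominal values (not measured from any particular drive or media), and the size estimate is only a lower bound:

```python
import math

# The "4.7 GB" on the box is decimal gigabytes; Windows reports binary ones.
print(4.7e9 / 2**30)             # ~4.38 "Windows" gigabytes

# Assumed nominal single-layer capacities, in 2048-byte sectors:
DVD_PLUS_R = 2_295_104 * 2048    # 4,700,372,992 bytes
DVD_MINUS_R = 2_298_496 * 2048   # 4,707,319,808 bytes

def burned_size_estimate(file_sizes, sector=2048):
    """Lower-bound estimate of the space a set of files needs on disc:
    each file is rounded up to a whole sector. Directory records, path
    tables and session overhead come on top of this."""
    return sum(math.ceil(s / sector) * sector for s in file_sizes)
```

That per-file sector rounding, plus the directory overhead, is presumably where the gap between the raw byte total and the actual failure point comes from.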

Any thoughts/clarifications on the above points would be welcome.


1. Did I say that it's not possible at some point?

2. "GB number seems to always roughly twice what it should be"

Bug in an older version of CDBurnerXP. Should be already fixed.

2. it should really be "3 Folders"

Seems like it only counts root items currently, and considers folders and files the same. Will adjust that for the next version.

3a, 3b: This does indeed make little sense to me. Will fix that for the next version.

5. I guess it's obvious that -dao and -sao cannot both be specified. Other than that, there are no restrictions. Whatever does not make sense will be ignored. There may indeed be a problem with the -dao option; this should work properly in the next version though.

6. I can't give you docs on the DVD file systems, and it would be rather impractical if I did and you had to figure out how many files fit on your disc. For this situation, a programmatic solution would be more appropriate, but what would you expect?


Hi Flo - I'm sorry if you interpreted the tone of my notes as critical. I should have prefaced them with "WOW! - it is amazingly easy to do significant data burns with a remarkably simple set of commands." If any criticism was implied, it was just that all this power can't be adequately covered in one page of help documentation. Fortunately, blank disks are cheap for experimentation. I tried to address the areas where I had the hardest time making full use of your excellent program.

1. Batch files. My point was not that you had implied it couldn't be done, but that it was not obvious (to me) that it would be so easy. I was not aware that batch file lines could be easily continued in a text file (and I have not explored any actual limits, such as the number of "-folder" commands that can be handled, or "gotchas" such as a 256-character overall limit, or 1024, 4096 or some internal buffer size). At the moment, I have not encountered any limits in my testing. I do all my programming in the Delphi language, and it is very easy to build .bat files on the fly and execute them - this makes an easy interface to your excellent software. I was afraid that I would have to replicate the XML file logic of the saved burn files, which would have been MUCH harder.
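A minimal sketch of building such a batch file on the fly (shown in Python for brevity rather than Delphi; the paths and compilation name are just the examples from the first post):

```python
# Build a caret-continued batch file for cdbxpcmd.exe programmatically.
# Paths and the compilation name are illustrative examples only.
exe = r"c:\Program Files (x86)\CDBurnerXP\cdbxpcmd.exe"
folders = {r"\Baldacci": r"\MP3Books\Baldacci",
           r"\Clancy":   r"\MP3Books\Clancy"}

lines = [f'"{exe}" --burn-data -device:0 ^']
lines += [f"-folder[{dst}]:{src} ^" for dst, src in folders.items()]
lines.append("-name:BookTest6 -close")

with open("burn.bat", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Since the carets join everything into one logical command, the practical cap should be cmd.exe's overall command-line length limit (8191 characters, as I understand it), not any per-text-line limit.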

2. Gigabyte reporting twice what it should be - My download was only a week or so ago; the interactive version reports 4.3.2.2140, and both exe files show a compile date of 5/2/2010 11:43, which would seem pretty recent. I have not noticed any similar problem with the interactive version (but it would only appear in the middle of the burn stage and I might not have noticed). With cdbxpcmd being such a small exe file, I assume it must be a wrapper for the same DLLs that the interactive version uses, so I would be surprised if they acted differently.

3. Files versus folders - Since this is the same reporting line as the gigabyte problem, it would not surprise me if the two are related.

5. My use of "-dao" was early in my testing, so errors were probably caused by something else, but my recollection is that an otherwise good burn failed if I included "-dao". I will test more and see if I can come up with a specific symptom.

6. Disc capacity. My complaint was not directed at you, but at the general problem. I have since spent several hours with Google links and do not feel much closer to full understanding. I have no need to fill a disc precisely to its limits (although I see that some other threads on this forum do care about precise burn limits). Since trying to write too much data quickly results in a failed burn, but without writing anything to the disc, all I need to do is reduce the request a little and try again. I was hoping for a simple answer, but I think the question is more complicated than it appears.

------

Again, I apologize if anything I said implies that you have failed in any way. I value my own time highly and I only bother writing about a program if I feel it is very good, but could get a little better, or if I spent a lot of time missing something obvious and I might save somebody else some time. Thank you again for an excellent program. Howard


Hm, seems like you interpreted my rather short and direct answers as some kind of being annoyed. No - I did not perceive your questions as complaints, instead, I'm happy that you pointed out your problems in detail :)

1. Well, since my docs do not mention batch files at all, I am not currently sure how to integrate your hint about multi-line batch files. You may, however, suggest a section about batch files for the command line help page. Just post the text here, and I'll add it to the help file.

2. You should try version 4.3.2.2212 then.

5. You don't have to - another user reported such a problem to me, and as I said before, it will apparently be fixed in the next version.

6. Nope, there is no simple answer. The required size does not only depend on the included files' sizes, but also the used file system.


Hi Flo - I do have to be careful - I've been known to rant on a subject and then discover that it is a clearly addressed FAQ or something....

Anyway... on point #2, the gigabytes problem: yes, version 4.3.2.2212 does address the problem, and a 1.93 GB burn reports the size correctly, but a 3.23 GB burn reports as "-802512.00 KB". It looks strongly like the display formatting code, or the byte count itself, passes through a 32-bit integer rather than a 64-bit integer. (Been there, done that.)
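In fact the displayed figure matches a signed 32-bit wraparound exactly. A quick sketch (the byte count below is inferred from the display, so it is an assumption):

```python
# "-802512.00 KB" is exactly what a ~3.23 GB byte count produces when it
# is squeezed through a signed 32-bit integer (byte count inferred here).
def as_int32(n):
    """Interpret n modulo 2**32 as a signed 32-bit integer."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

burn_bytes = 3_473_195_008           # ~3.23 binary GB, assumed
print(as_int32(burn_bytes) / 1024)   # -802512.0, matching the display
```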

Thanks - Howard

> My specific needs are to make backup data disks of a relatively small number of root folders, but the subfolders and files may be added or deleted between backups. Saving a "compilation file" allows you to re-burn an exact set of files, but will not pick up any new ones since the compilation was made.

I had the same problem. But then I realized that by right clicking in the compilation pane, the context menu starts with an Update compilation entry that does just that. It re-scans the folders and picks up the new contents. I use it regularly now to make backup copies of constantly changing folders. If this is all you really need, there is no need to do it the hard way...

Bye,

Gábor


Well, there is a big "oops" here... Does it work from the original drag list, or from the set of folders found in the compilation? If the former, then it would pick up additional subfolders not present in the compilation; if the latter, it might not. One of my primary folders to back up is "c:\D6", which is the root for all of my 400+ Delphi projects. The 400 are arranged in various subtrees, so a new project might be placed several levels down in the path. If I drag "c:\D6" down, I will pick up all of the subfolders.

A very casual examination of the compilation file shows an explicit list of files and folders. I can visualize how deleted files can be detected easily, and that each folder could be re-scanned to add new files, but adding new subfolders is not so obvious. One reason I am somewhat doubtful is that a key reason to have a compilation file is to avoid having to re-examine every path, and being able to handle all of the file/folder additions and deletions pretty much eliminates any time savings. Logically, what I need would be handled by a "saved-drag" file rather than a "compilation" file. The time saved in daily burns would be the time needed to select the root folders rather than the time spent finding each file to be burned.

My daily-burn path list would probably only be 4 or 5 roots, so the savings would be trivial. Less frequent backups might be much more complicated, so a "saved-drag" concept is useful (that is what I am actually building for myself around the cdbxpcmd.exe program). This effort is not unique to burning DVD backups. Wayyyyy back in DOS, there was an XCopy command that would perform a "smart" copy of folders, not copying what was already there and dated the same or newer. Even with the improved performance of recent Windows file systems, there is a lot to be gained by only copying what has actually changed.
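That XCopy idea - copy only what is missing or newer at the destination - sketches out like this (illustrative Python, in the spirit of XCopy's /D switch; not anyone's actual implementation):

```python
import os
import shutil

def smart_copy(src, dst):
    """Copy only files that are missing at dst or newer at src,
    in the spirit of DOS XCopy's /D switch (illustrative sketch)."""
    for root, _dirs, files in os.walk(src):
        out = os.path.join(dst, os.path.relpath(root, src))
        os.makedirs(out, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(out, name)
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)   # copy2 preserves timestamps
```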

I guess I need to burn a few test discs... Thanks for the clue.


Short answer to my own question, after some testing: however it does it, loading a previous compilation with the proper option set does add new subfolders placed at various levels in the original test folder whose compilation was saved. I have not examined performance - whether saving a compilation that will probably have changes is faster to load and update than just adding the folders fresh each time. It doesn't seem intuitively like it could be any faster, but if it's not much slower, the convenience would be well worth it.

