
Automated drawing bot help

I am currently developing an open-source (GNU-licensed) automated drawing bot for the AxiDraw machine, which uses the EiBotBoard (EBB). The project is designed so that there is no human interaction apart from peeling off the note with the drawing on it.
The biggest challenge I am facing is finding a way to automate the drawing process.

I have successfully gone through the entire process manually: generating the desired SVG, importing it into Inkscape, and drawing it using the WaterColorBot extension. I have also used RoboPaint, but the WaterColorBot extension is giving better results.

I am wondering whether I have missed something that already has this functionality. If not, where is a good place to start, and what existing code could be helpful for this project? There is a lot to sift through, and you know this domain.

Any suggestions would be greatly appreciated.


  • Can you please clarify what your questions are?

    The AxiDraw is already a drawing bot, and already automated, so it's not clear what you mean when you say that you are developing an automated drawing bot for it.

    I don't know what "peeling the note" means. 

    You ask whether you have missed something that already has "this functionality", but haven't said what "this functionality" is. Can you please say what functionality it is that you feel is missing?

  • edited August 2015
    I have only spent a few days working on this project, so I guess I don't have the correct terminology; sorry.

    Just to clarify everything:
    What I need to do is write a program which will take an SVG file and draw it onto a Post-it note for a human to peel off.
    The functionality I need is to take the SVG file and get the EiBot board to draw it.
    I have spent the last few days trawling through all the code, trying to gain a better understanding of the entire process.

    • Are the WaterColorBot drawing components dependent on Inkscape? If so, would it be scriptable to specify an SVG to be drawn?

    • Does any of the other software have few or no dependencies on user input in its drawing components?

    • Can I run any of your software through a terminal, specifying an SVG file either via the terminal or from code?

    Once I know this I can determine whether I am heading in the right direction.

  • edited August 2015
    > Are the WaterColorBot drawing components dependent on Inkscape? If so, would it be scriptable to specify an SVG to be drawn?

    The WaterColorBot driver for Inkscape is built as an Inkscape extension, and it does rely upon Inkscape. So far as I know, Inkscape does not provide a scripting ability that can automatically call this extension. However:

    (1) The Inkscape extension is only one of several available interfaces that can be used to control the WaterColorBot/AxiDraw/EggBot (or other machines based upon the EBB), and
    (2) The Inkscape extension is a plain-text Python file, and the Inkscape features that it uses are almost exclusively for managing its GUI. If you wanted to, it should be reasonably straightforward to pick out the "good parts" and call them from a different context in a Python script.
    (3) It may be possible to use a macro script to control Inkscape, even outside of its normal intended scripting ability. What would you actually need to automate? Two menu items, ultimately: File > Revert and Extensions > Previous Extension. Those two, in combination with changing the contents of the SVG file programmatically, would be sufficient to do what you need.
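    The "changing the contents of the SVG file programmatically" step could be sketched in Python like this. This is a minimal, hypothetical example: the file name, page size, and path data are placeholders, not anything from the WaterColorBot code.

    ```python
    import xml.etree.ElementTree as ET

    SVG_NS = "http://www.w3.org/2000/svg"
    ET.register_namespace("", SVG_NS)  # write the SVG with a default namespace

    def write_drawing(path_data, out_file="drawing.svg"):
        """Overwrite out_file with a minimal one-path SVG, ready for
        Inkscape to File > Revert and re-run the previous extension on."""
        svg = ET.Element("{%s}svg" % SVG_NS, width="76mm", height="76mm")
        ET.SubElement(svg, "{%s}path" % SVG_NS, d=path_data,
                      style="fill:none;stroke:#000000")
        ET.ElementTree(svg).write(out_file, xml_declaration=True,
                                  encoding="utf-8")

    # Example: write a single diagonal stroke, then trigger the
    # Revert / Previous Extension steps in Inkscape by other means.
    write_drawing("M 0 0 L 10 10")
    ```

    Each new drawing would rewrite the same file, with the macro script handling the two Inkscape menu items.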

    Obviously, for RoboPaint and RoboPaint RT, Inkscape is not involved at all.

    > Does any of the other software have few or no dependencies on user input in its drawing components?

    I'm afraid that I don't understand this question. Which other software, and what "drawing components" do you mean?

    > Can I run any of your software through a terminal, specifying an SVG file either via the terminal or from code?

    Yes. For example, by using RoboPaint's API, which allows you to just send an SVG file to print.  

    However, in many cases it is nicer to go through Inkscape, for example if you want to give it exact curves to trace rather than an SVG to try to approximate. RoboPaint and the Inkscape extensions work very differently "under the hood."  Since you're programmatically generating the data, you might also want to skip the SVG stage altogether and directly control the EBB through the CNCServer API built into RoboPaint, which allows direct "go to x,y" control over the behavior of the robot.
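    A rough sketch of that "go to x,y" style of control, as I understand CNCServer's REST interface: the pen is driven by PUT requests to a `/v1/pen` endpoint, with x/y given as percentages of the drawing area. Treat the port, endpoint, and field names here as assumptions to verify against your RoboPaint/CNCServer version.

    ```python
    import json
    from urllib import request

    API_BASE = "http://localhost:4242"  # CNCServer's default port (assumption)

    def pen_request(x=None, y=None, state=None):
        """Build a PUT /v1/pen request; x and y are percentages (0-100)
        of the drawing area, state is e.g. "up" or "draw" (assumed names)."""
        body = {}
        if x is not None and y is not None:
            body["x"], body["y"] = x, y
        if state is not None:
            body["state"] = state
        return request.Request(API_BASE + "/v1/pen",
                               data=json.dumps(body).encode(),
                               headers={"Content-Type": "application/json"},
                               method="PUT")

    # Usage (requires RoboPaint/CNCServer to be running):
    # request.urlopen(pen_request(state="draw"))
    # request.urlopen(pen_request(x=50, y=50))  # move toward the centre
    ```

    This skips both Inkscape and the SVG file entirely; your program computes coordinates and issues moves directly.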

  • Thanks for the help,

    I decided to go with the API, as it was the quickest way to get the system up and running.

    However, when the software receives and completes a job in API mode, it returns to the normal paint mode instead of the API job listener.

    Is it possible to return to the listener state, instead of the normal paint mode, once it completes the drawing?

  • It is not currently possible through that API.  

    However, I have submitted a feature request for it, because it's a good idea: 

    Also, if you're handy with JavaScript, you may find it straightforward to disable the automatic return from Remote Print mode. I'd start by looking at main.api.js.