So, I bought one of these things to use with my first personal Arduino-based project. I have it wired up to a prototype board now. I'm using the display library from Adafruit, which seems to work ok, as far as it goes. Here's my question: when I put pixel values into an array of bytes and tell the LCD to display them, it gets them quite mixed up. It seems to help a little bit to order the array so that your pixels are listed top->bottom, left->right -- as opposed to the more conventional reverse, where you would put one row after the next. Still, even this won't get the image quite right. Does anyone have any idea what this display is trying to do with my pixel data?
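To be concrete about the ordering, here's the kind of reordering I mean -- just a sketch with made-up names, assuming the source image is 1 bit per pixel, stored the "conventional" way (one row after the next, MSB first), with width and height both multiples of 8:

    #include <stdint.h>

    // Read one pixel from a row-major, 1-bit-per-pixel source image.
    static uint8_t getSourcePixel(const uint8_t *src, int w, int x, int y) {
      return (src[(y * w + x) / 8] >> (7 - (x & 7))) & 1;
    }

    // The reordering I described: walk each column top->bottom, packing 8
    // vertical pixels into each output byte, then step one column to the right.
    void packColumnWise(const uint8_t *src, uint8_t *dst, int w, int h) {
      int i = 0;
      for (int x = 0; x < w; x++) {        // left -> right
        for (int y = 0; y < h; y += 8) {   // top -> bottom, 8 pixels at a time
          uint8_t b = 0;
          for (int bit = 0; bit < 8; bit++) {
            b |= getSourcePixel(src, w, x, y + bit) << bit;
          }
          dst[i++] = b;
        }
      }
    }

That gets me closer than a straight row-by-row copy, but it's still not quite what the display wants.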
There appear to be one standalone and two online image converters, and the output of these converters -- at least, the output of this one -- displays properly on the screen. I can't quite figure out how to do my own conversion and get it to work properly, though, and there appears to be absolutely no available information on the process. I have managed to come up with a correct conversion for a 16x16 pixel image, but when I try to scale things up, it breaks for some reason. Does anyone know what these displays expect in terms of image data? Also, does anyone know why the libraries haven't been written to treat said data in a more sane manner?
It looks like these guys are using a setpixel function, but if my reading of the code is correct, they're really setting the location in the array, and not the location in the display; they just let the display shuffle things around on you. Is there some kind of standard method of pixel-shuffling of which I'm unaware?
Chris
Comments
Chris
Each byte you hand it is a column (yes, really) of 8 bits, but the columns are written out from left to right, then top to bottom. So the first byte represents the top-left column of 8 bits, the second represents the one to the right of it, and so on. It's quite odd. The library just draws it back out in that same order when it updates the screen. Anyway, once you arrange your pixels this way, they do show up in the correct spot on the screen.
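In case it saves someone else the trouble, here's the mapping written out as a setPixel-style helper. This is just my own sketch of the layout described above, not the library's actual code; the width, height, buffer name, and bit order (bit 0 at the top of each strip) are assumptions for my module:

    #include <stdint.h>

    // Assumed geometry and buffer name -- adjust for your module.
    const int LCD_WIDTH  = 84;
    const int LCD_HEIGHT = 48;
    uint8_t frameBuffer[LCD_WIDTH * LCD_HEIGHT / 8];

    // Each buffer byte is a vertical strip of 8 pixels. Byte 0 is the strip at
    // x = 0, rows 0-7; byte 1 is x = 1, rows 0-7; after LCD_WIDTH bytes you
    // drop down to rows 8-15, and so on.
    void setPixel(int x, int y, bool on) {
      if (x < 0 || x >= LCD_WIDTH || y < 0 || y >= LCD_HEIGHT) return;
      uint8_t &b = frameBuffer[x + (y / 8) * LCD_WIDTH];
      if (on) b |=  (uint8_t)(1 << (y & 7));
      else    b &= (uint8_t)~(1 << (y & 7));
    }

Converting a whole row-major image is then just a matter of calling that for every pixel, or packing the bytes directly in this order.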
Chris
Chris
I remember that some BBC Micro display modes used 8-pixel rows stacked in groups of 8, so that 8 bytes made a character. But then you are limited to 8-pixel-wide characters. By using 8-bit columns you make it easy to write fonts 8 pixels high by any arbitrary width. I suppose that for less height you would still have one byte per column in the font definition and just mask off some of the bits when you display it.
I don't imagine that this will help you at all, but I would guess at this as the reason.
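To make that concrete, here's a sketch of what I mean -- a made-up glyph stored one byte per column (bit 0 at the top), plus the sort of masking you'd do for a shorter font:

    #include <stdint.h>

    // A made-up approximation of 'A': one byte per column, 8 pixels tall,
    // bit 0 at the top. Any width works -- just add or remove column bytes.
    const uint8_t glyph_A[5] = { 0x7E, 0x11, 0x11, 0x11, 0x7E };

    // For a font that is only 'rows' pixels high, keep the one-byte-per-column
    // layout and mask off the unused (bottom) bits when drawing.
    uint8_t maskColumn(uint8_t column, int rows) {
      return column & (uint8_t)((1 << rows) - 1);
    }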
Chris
Anyway, that being said, it's now _just_ an aesthetic argument, since everything is working. I'm looking at the way it handles text now, which has its own oddities. The built-in font is a 5x7 pixel font. The entire font is packed into an array, and the text-handling functions have the 5x7 size hard-wired into them. They'll resize it in multiples of two by drawing filled squares on the display instead of single pixels, but that's pretty much it. I'm thinking about trying to load in a larger font (at least for displaying numerics) and also supporting alternate languages, at least to the point where the display could render non-ASCII characters.
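For my own notes while I dig into this, here's roughly how I picture the glyph-drawing step -- a sketch with assumed names, not the library's actual code. It assumes a setPixel(x, y, on) like the one I sketched further up the thread, and glyphs packed the same way the display wants them: one byte per column, bit 0 at the top.

    #include <stdint.h>

    // Assumed to exist elsewhere (see the earlier sketch).
    void setPixel(int x, int y, bool on);

    // Draw one 5x7 glyph at (x, y). Scaling by an integer factor just means
    // drawing size x size filled blocks instead of single pixels, which is the
    // "filled squares" resizing I described above.
    void drawGlyph(int x, int y, const uint8_t glyph[5], int size) {
      for (int col = 0; col < 5; col++) {
        for (int row = 0; row < 7; row++) {
          bool on = (glyph[col] >> row) & 1;
          for (int dx = 0; dx < size; dx++) {
            for (int dy = 0; dy < size; dy++) {
              setPixel(x + col * size + dx, y + row * size + dy, on);
            }
          }
        }
      }
    }

A larger numeric font, or a table of non-ASCII glyphs, would then just mean more columns per glyph, and more than one byte per column once you go past 8 pixels of height.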
Chris