Bytes IT Community

python for microcontrollers

Hi all,

I'm currently tackling the problem of implementing a Python-to-assembler
compiler for PIC 18Fxxx microcontrollers, and thought I'd open it up
publicly for suggestions before I embed too many mistakes in the
implementation.

The easy part is getting the AST, via compiler.ast. Also easy is
generating the code, once the data models are worked out.
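The compiler.ast module mentioned here was Python 2 only; it was removed in Python 3 in favour of the ast module. A rough sketch of the same first step, using the modern module (the source snippet and its suffixed names are invented for illustration):

```python
import ast

source = """
x_u16 = 1000
def blink(n_u8):
    return n_u8 * 2
"""

tree = ast.parse(source)
# Walk the tree and collect every identifier that is read or assigned;
# this is the raw material a code generator would start from.
names = sorted({node.id for node in ast.walk(tree)
                if isinstance(node, ast.Name)})
print(names)  # ['n_u8', 'x_u16']
```

From here, a code generator would dispatch on node types (ast.Assign, ast.BinOp, ast.FunctionDef, ...) to emit assembler.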

The hard part is mapping from the abundant high-level python reality to
the sparse 8-bit microcontroller reality.

I looked at pyastra, but it has fatal problems for my situation:
- no backend for 18fxxx devices
- only 8-bit ints supported

I'm presently ripping some parts from the runtime engine of a Forth
compiler I wrote earlier, to add support for 8- to 32-bit ints, floats,
and a dual-stack environment that offers comfortable support for local
variables/function parameters, as well as simpler and more compact code
generation.
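The dual-stack idea can be modelled in a few lines: one stack for operands, a second for frames of local variables, as in a classic Forth runtime. This is a hypothetical sketch of the concept, not the author's actual engine; all names are invented.

```python
# Minimal model of a Forth-style dual-stack runtime: a data stack for
# operands and a return stack holding local-variable frames.
class DualStackVM:
    def __init__(self):
        self.data = []      # operand stack
        self.rstack = []    # return stack: one frame of locals per call

    def push(self, v):
        self.data.append(v)

    def pop(self):
        return self.data.pop()

    def enter(self, nlocals):
        # Called on function entry: reserve space for locals.
        self.rstack.append([0] * nlocals)

    def local(self, i):
        return self.rstack[-1][i]

    def set_local(self, i, v):
        self.rstack[-1][i] = v

    def leave(self):
        # Called on function exit: discard the frame in O(1).
        self.rstack.pop()

vm = DualStackVM()
vm.enter(2)                  # enter a function with two locals
vm.set_local(0, 7)
vm.push(vm.local(0))
vm.push(5)
vm.push(vm.pop() + vm.pop()) # add the top two operands
print(vm.pop())              # 12
vm.leave()
```

Because locals live in a frame on the second stack, function entry and exit are just a push and a pop, which is what makes the generated code simple and compact.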

Python is all about implicitly and dynamically creating/destroying
arbitrarily typed objects from a heap. I've got a very compact
malloc/free, and could cook up a refcounting scheme, but using this for
core types like ints would destroy performance on a chip that's already
struggling to do 10 MIPS.
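To see why refcounting core types is so costly, here is a bare-bones reference-counting heap sketched in Python (a toy model for illustration, not the compact malloc/free mentioned above): every copy of a reference and every discard touches the count, which for plain ints would dwarf the cost of the arithmetic itself.

```python
# Toy reference-counting allocator over a fixed pool of cells.
class Heap:
    def __init__(self, size):
        self.cells = [None] * size
        self.refcnt = [0] * size
        self.free = list(range(size))   # indices of unused cells

    def alloc(self, value):
        idx = self.free.pop()
        self.cells[idx] = value
        self.refcnt[idx] = 1
        return idx

    def incref(self, idx):
        self.refcnt[idx] += 1

    def decref(self, idx):
        self.refcnt[idx] -= 1
        if self.refcnt[idx] == 0:
            self.cells[idx] = None      # cell is recycled
            self.free.append(idx)

heap = Heap(4)
a = heap.alloc(42)
heap.incref(a)          # a second reference appears...
heap.decref(a)          # ...and goes away
heap.decref(a)          # last reference dropped: cell returns to pool
print(len(heap.free))   # 4
```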

The best idea I've come up with so far is to use a convention of
identifier endings to specify type, e.g.:
- foo_i16 - signed 16-bit
- foo_u32 - unsigned 32-bit
- bar_f - 24-bit float
- blah - if an identifier doesn't have a 'magic ending', it will
be deemed to be signed 16-bit

Also, some virtual functions uint16(), int16(), uint32(), int32(),
float() etc., which work similarly to C casting and type conversion, so
I don't have to struggle with type inference at compile time.
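The suffix convention is simple enough that the compiler-side lookup fits in a few lines. A hypothetical sketch (the suffix table mirrors the examples above; the helper name and extra 8-bit entries are invented):

```python
import re

# Map each "magic ending" to a (kind, bits) description.
SUFFIXES = {
    "i8":  ("int", 8),    "u8":  ("uint", 8),
    "i16": ("int", 16),   "u16": ("uint", 16),
    "i32": ("int", 32),   "u32": ("uint", 32),
    "f":   ("float", 24),  # PIC-style 24-bit float
}

def type_of(name):
    m = re.search(r"_(i8|u8|i16|u16|i32|u32|f)$", name)
    if m:
        return SUFFIXES[m.group(1)]
    return ("int", 16)   # no magic ending: deemed signed 16-bit

print(type_of("foo_u32"))  # ('uint', 32)
print(type_of("bar_f"))    # ('float', 24)
print(type_of("blah"))     # ('int', 16)
```

The cast functions would then just be unary AST nodes that override the inferred suffix type of their argument.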

Yes, this approach sucks. But can anyone offer any suggestions which
suck less?

--
Cheers
EB

--

One who is not a conservative by age 20 has no brain.
One who is not a liberal by age 40 has no heart.
Aug 8 '05 #1
14 Replies


Evil Bastard wrote:
I'm currently tackling the problem of implementing a python to assembler
compiler for PIC 18Fxxx microcontrollers


Perhaps porting Pyrex would be easier. Pyrex takes a python-like syntax
(plus type information, etc.) and emits C, which is then compiled.
--
Benji York
Aug 8 '05 #2

Benji York wrote:
Perhaps porting Pyrex would be easier. Pyrex takes a python-like syntax
(plus type information, etc.) and emits C, which is then compiled.


Pyrex totally rocks. But for the PIC targeting, no can do:
- pyrex generates a **LOT** of code, which makes extensive use of the
Python C API, which is inextricably tied to dynamic objects
- PIC program memory is 32 kbytes max

Thanks all the same.

Any other suggestions?

--
Cheers
EB

Aug 8 '05 #3

Evil Bastard wrote:
Benji York wrote:
Perhaps porting Pyrex would be easier.
Pyrex totally rocks. But for the PIC targeting, no can do: ... Any other suggestions?


Yes, port Lua instead. Lua is pretty much designed for this sort of
application, and is probably "Pythonic" enough to provide whatever
advantages you were trying to get from using Python, short of it
actually being Python.

FWIW, the interpreter/virtual machine appears to be designed to be very
conservative with memory, though there's a chance it is somewhat tied to
having 32-bit integers available (yet could perhaps be ported easily to
some freaky 12-bit microcontroller anyway :-) ).

-Peter
Aug 8 '05 #4

Evil Bastard <sp**@me.please> writes:
Yes, this approach sucks. But can anyone offer any suggestions which
suck less?


I don't think you want to do this. Runtime type tags and the overhead
of checking them on every operation will kill you all by themselves.
Processors like that haven't been used much as Lisp targets either,
for the same reasons. Pick a different language.
Aug 8 '05 #5

How about just helping this project:

http://pyastra.sourceforge.net/

I know he's trying to rewrite it to work across multiple uCs (AVR, MSP430, etc.)

HTH,

Guy

Aug 8 '05 #6

Paul Rubin wrote:
I don't think you want to do this. Runtime type tags and the overhead
of checking them on every operation will kill you all by themselves.
Processors like that haven't been used much as Lisp targets either,
for the same reasons. Pick a different language.


I was thinking that you could avoid this by adding some type inference
to Python and/or reducing everything to two basic types (strings and
ints), but the end result would require a lot of work and not look much
like Python.
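The "reduce everything to two basic types" idea can be sketched as a toy forward inference pass over assignments. This is a hypothetical illustration using the modern ast module (not the period's compiler package, and nothing like a complete inferencer): the type of the right-hand side, from literals or previously typed names, propagates to the assigned name.

```python
import ast

def infer(source):
    """Toy inference: map each assigned name to 'int' or 'str'."""
    env = {}
    for stmt in ast.parse(source).body:
        if isinstance(stmt, ast.Assign):
            target = stmt.targets[0].id
            env[target] = type_of_expr(stmt.value, env)
    return env

def type_of_expr(node, env):
    if isinstance(node, ast.Constant):
        return type(node.value).__name__     # 'int' or 'str'
    if isinstance(node, ast.Name):
        return env.get(node.id, "unknown")
    if isinstance(node, ast.BinOp):
        # Assume both operands agree; a real pass would unify or complain.
        return type_of_expr(node.left, env)
    return "unknown"

print(infer("a = 1\nb = 'hi'\nc = a + 2"))
# {'a': 'int', 'b': 'str', 'c': 'int'}
```

Even this trivial version shows the pain point: anything beyond straight-line assignments (branches, loops, calls) forces the pass to merge or reject conflicting types, which is where the "lot of work" starts.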

For a PIC, I'm not even sure I'd want to use C++. Then again, I'm not
much of a fan of BASIC stamp modules, either, so...
Aug 8 '05 #7

Hi Bastard,

one of the main reasons PyPy got funded by the EU was the promise to
port Python to embedded systems (though not necessarily very
memory-restricted ones). The project seems to be in a state where the
team is trying to get rid of the CPython runtime altogether and reach
some autonomy. The idea of providing a Forth backend would fit into the
concept and would be great IMO.

Maybe you should also have a look at the JavaCard platform
specification for inspiration about what is possible for an OO language
on time- and memory-restricted platforms.

http://java.sun.com/products/javacard/specs.html

Since there are vendors that have implemented the JVM on smartcards
against Sun's spec, and JavaCards are on the market (not just research
prototypes), the approach has been proven successful. For the matter of
memory layout, I would naively try to adapt the type inferencer (the
annotator in PyPy slang) and restrict Python to an even smaller subset
than RPython (SmartJava does not even support strings; on the other
hand, the explicit casting to short and byte types is by no means
braindead). I currently don't have an idea for memory management, and
I'm not even sure about implicit or explicit finalization.

Kay

Aug 9 '05 #8

David Cuthbert wrote:
Paul Rubin wrote:

I don't think you want to do this. Runtime type tags and the overhead
of checking them on every operation will kill you all by themselves.
Processors like that haven't been used much as Lisp targets either,
for the same reasons. Pick a different language.


I was thinking that you could avoid this by adding some type inference
to Python and/or reducing everything to two basic types (strings and
ints), but the end result would require a lot of work and not look much
like Python.

Use Forth.
Aug 9 '05 #9

Paul Rubin wrote:
I don't think you want to do this. Runtime type tags and the overhead
of checking them on every operation will kill you all by themselves.
Processors like that haven't been used much as Lisp targets either,
for the same reasons. Pick a different language.


On thinking about it, you might be right.

Unless one implements a full dynamic OO engine, the prospect of
implementing python for microcontrollers resembles the task of moving a
palace's furnishings into a trailer park - you'll get a couple of things
into the trailer, but will have to leave the rest outside in the mud and
the rain.

Maybe I should clean up my forth compiler instead, and get it ready for
the prime time.

--
Cheers
EB

Aug 9 '05 #10

Evil Bastard wrote:
Paul Rubin wrote:
Pick a different language.


Maybe I should clean up my forth compiler instead, and get it ready for
the prime time.


In searching for an existing Lua virtual machine** for the PIC,
following on my previous posting, I came across several references to
Forth implementations for the PIC. Maybe rolling your own isn't
necessary here.

-Peter

** I was rather surprised to discover a distinct lack of evidence that
anyone has a Lua VM for the PIC. There appear to be traces of
interpreters written for chips of similar scale to the PIC 18F series,
but none specifically for the PIC. So while it would probably make an
interesting project, and I'm pretty sure it's quite feasible, grabbing
an off the shelf Forth might be a more productive use of your time.
Aug 10 '05 #11

Peter Hansen wrote:
So while it would probably make an
interesting project, and I'm pretty sure it's quite feasible, grabbing
an off the shelf Forth might be a more productive use of your time.


Heh, methinks one might be misunderstanding the Forth culture.

Forth compilers are like poetry, in that the number of available works
exceeds the number of users. :)

Forth can, and so often does, disappear up its own ring-hole in the
blink of an eye. Its hardcore extensibility is so often its own
downfall, since, with the way many people work with it, it earns its
reputation as a 'write-only language' over and over again. Quite often,
to understand a piece of Forth code, you have to crawl in and out of
all the author's body cavities many times.

What I'm saying is that it often takes less time to write a Forth than
to properly learn and understand someone else's implementation. In this
way, Forth is like undergarments - you can admire those worn by others,
but you sure as hell don't want to wear them yourself. :P

--
Cheers
EB

Aug 10 '05 #12


Evil Bastard wrote:
Peter Hansen wrote:
grabbing
an off the shelf Forth might be a more productive use of your time.
Heh, methinks one might be misunderstanding the Forth culture.


Lacking entirely in any knowledge of it whatsoever would be a more
accurate description. "Ignorant of" is even shorter, though no less
accurate. ;-)
Forth can, and so often does, disappear up its own ring-hole in the
blink of an eye. Its hardcore extensibility is so often its own
downfall, since, with the way many people work with it, it earns its
reputation as a 'write-only language' over and over again. Quite often,
to understand a piece of Forth code, you have to crawl in and out of
all the author's body cavities many times.
Okay, fine. I can accept that Forth code can be inscrutable. A belief
that that would be the case is one reason I've actually never really
done any Forth.

But I thought _you_ were the one who brought up Forth, so clearly you
can't be against it on basic principles.
What I'm saying is that it often takes less time to write a Forth than
to properly learn and understand someone else's implementation. In this
way, Forth is like undergarments - you can admire those worn by others,
but you sure as hell don't want to wear them yourself. :P


Okay, but this implies either that Forth implementations are written in
Forth (and not, e.g., C) or that people who implement Forth in, say, C
let their Forth-inspired inclination to write inscrutable code bleed
over into their C code as well.

So unless you're the only person around able to resist writing a poor
implementation of Forth, I'm still puzzled why you would want to write
your own when existing implementations could actually be just as
readable and useful (or more so?) than whatever you could whip up as a
one-off for this project.

(Not trying to argue, just understand, because it looks like you're
conflating Forth programs with Forth implementations, or perhaps I'm
even more ignorant than noted above and am missing a key point. :-)

-Peter
Aug 10 '05 #14

Peter Hansen wrote:
(Not trying to argue, just understand, because it looks like you're
conflating Forth programs with Forth implementations, or perhaps I'm
even more ignorant than noted above and am missing a key point. :-)


It's decades since I coded Forth, but I suspect that Forth compilers
typically behave differently. On the other hand, Forth is very simple,
so it's probably not rocket science to write a new Forth compiler. It
doesn't surprise me if it's simpler to write your own Forth compiler
than to use one written by someone else.

For a tiny device such as a PIC, it's not as if you will load a lot
of third-party libraries into your runtime environment. I assume the
I/O functionality you desire would be implemented in the compiler, and
it might well vary depending on your application how these functions
are best designed.
Aug 10 '05 #15
