
Re: gEDA-user: next PCB release - 1.99za vs 4.0



On Thu, 2010-09-09 at 08:37 -0700, Andrew Poelstra wrote:

> > I was never really sure at what point to start sub-classing to make a new
> > widget. My acid test was usually whether that widget had useful
> > self-contained functionality which could be re-used in other places.
> >
> > For example, I've got a moderately large refactor queued which removes
> > most of the drawing back-end specific code from the GTK hid, splitting
> > all the GDK specific rendering details into a separate file. With GL
> > enabled, it removes that file from the build and swaps in an equivalent
> > which uses OpenGL. All render-specific member variables are defined
> > locally to those files, avoiding the GDK/GL choice cluttering up more
> > global structures.
> > 
> > When devising this split, I was very tempted to turn the PCB drawing
> > area into its own widget, and teach that to draw (and potentially have
> > two implementations / sub-classes), but so far I haven't taken that route.
> > There is code in PCB which depends on GDK / GL rendering, but renders
> > not to a widget, rather to a GdkPixbuf.
> > 
> 
> I think I would have done it the OO way, only to keep things consistent
> with how glib works. Your way is the "natural" C way to do things.

The renderer can have non-OO helper functions too, I guess, but it felt
odd putting those into a file with a widget implementation. I had hoped
to keep it to one file for GDK-specific code, and one file for
GL-specific code.
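
Very roughly, the shape I'm aiming for is a small internal rendering API
that both files implement (the names here are just illustrative, not
what's actually in my branch):

    /* hid/gtk/ghid-render.h  (illustrative names only)
     *
     * The widget / event code calls these; gtkhid-gdk.c and gtkhid-gl.c
     * each provide an implementation and keep their renderer-specific
     * state (GdkPixmap, GL context, ...) static to their own file.
     */
    #include <gtk/gtk.h>

    void ghid_render_attach (GtkWidget *drawing_area);
    void ghid_render_expose (GtkWidget *drawing_area, GdkRectangle *area);
    void ghid_render_invalidate_all (void);

The build then picks exactly one of the two .c files depending on
whether GL was enabled at configure time, so the choice never leaks out
of hid/gtk/.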

I'm still sorely tempted to make a widget out of the PCB view area at
some point, but it feels difficult to detach from the core.

Even if you code the "view" part of the model/view/controller pattern,
you still have to hook in the controller code (which is different
depending on what kind of widget / editor you are presenting).
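
Concretely, even a clean "view" widget ends up wanting a pluggable set
of controller hooks, different for the board editor versus a read-only
preview. Something like this (purely hypothetical, just to show the
shape of it):

    #include <gtk/gtk.h>
    #include <stdbool.h>

    /* Hypothetical controller hooks a PCB view widget could accept.
     * A footprint preview would leave most of these NULL; the main
     * editor would wire them through to the crosshair / tool code.   */
    typedef struct {
      bool (*button_press)   (void *ctx, double x, double y, int button);
      bool (*button_release) (void *ctx, double x, double y, int button);
      bool (*motion)         (void *ctx, double x, double y);
      bool (*key_press)      (void *ctx, guint keyval);
      void *ctx;
    } PcbViewController;

    void pcb_view_widget_set_controller (GtkWidget *view,
                                         const PcbViewController *controller);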

There "almost" is a renderer widget, for previewing footprints. (Ok,
there is one, but it is pretty nasty.) The widget currently has to reach
into the core of PCB, and relies heavily on how PCB's core rendering
model works.

Each widget which draws "PCB" things (be that the main drawing area,
the pinout preview, or the library selector) renders by saving a bunch
of global state in PCB's core, then calling a drawing-request function;
the core then fires back into callbacks the GUI has registered (the
drawing calls which render the PCB geometry).
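
In pseudo-C, the dance each of those widgets does looks something like
this (the function names are made up to show the flow, not the real
core API):

    #include <gtk/gtk.h>

    /* Made-up names, standing in for the real core entry points. */
    extern void save_global_view_parameters (GtkWidget *widget);
    extern void core_request_redraw (GdkRectangle *area);

    static gboolean
    preview_expose_cb (GtkWidget *widget, GdkEventExpose *ev, gpointer data)
    {
      /* 1. Stash a pile of global view state inside the core:
       *    zoom, offsets, visible layers, ...                        */
      save_global_view_parameters (widget);

      /* 2. Ask the core to walk the board / element data...          */
      core_request_redraw (&ev->area);

      /* 3. ...which fires back into the drawing callbacks this GUI
       *    registered (draw_line, draw_arc, fill_polygon and friends)
       *    for every object the core decides should be drawn.        */
      return FALSE;
    }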

Rather more irksomely, the core sometimes calls these when the GUI
doesn't want them (at least in the case of the GL version, which has
different rules about how / when the screen is painted). The core
assumes a particular rendering / update model shared by the old X11 /
Lesstif / GDK HIDs.
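
What the GL code ends up needing is a guard along these lines (a sketch
only), so that core-initiated drawing outside of an expose just turns
into a queued redraw instead of touching the GL context at the wrong
time:

    #include <gtk/gtk.h>
    #include <stdbool.h>

    static GtkWidget *gl_drawing_area;   /* the GL-backed view widget    */
    static bool in_expose = false;       /* set around our expose handler */

    /* Called when the core decides something needs repainting.
     * (Illustrative name -- the real hook is one of the registered
     * drawing / invalidate callbacks.)                                  */
    static void
    ghid_gl_invalidate (void)
    {
      if (!in_expose)
        {
          /* Not painting right now: just schedule a proper redraw.      */
          gtk_widget_queue_draw (gl_drawing_area);
          return;
        }
      /* Inside our own expose handler: safe to issue GL drawing here.   */
    }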

I've been slowly gathering the energy to deal with this, and have
already pushed some of the object rendering into higher-level
callbacks, such as "draw_pcb_polygon", rather than having the core
assume the GUI wants to be passed a particular type of polygon data.
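
The direction is something like this in the HID's drawing interface
(simplified; the real structure and argument types differ, and the
types below are the ones from PCB's core headers), so a GUI deals with
whole objects rather than pre-digested geometry:

    /* Simplified sketch -- not the actual HID struct layout.
     * hidGC / PolygonType / BoxType as in PCB's hid.h / global.h.     */
    typedef struct hid_drawing_hooks {
      /* Low-level style: the core clips / tessellates and hands the
       * GUI bare primitives.                                          */
      void (*fill_polygon) (hidGC gc, int n_coords, int *x, int *y);

      /* Higher-level style: the GUI is handed the polygon object and
       * decides itself how to render it (a GL HID might cache a
       * tesselation, for instance).                                   */
      void (*draw_pcb_polygon) (hidGC gc, PolygonType *polygon,
                                const BoxType *clip_box);
    } hid_drawing_hooks;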

The heads of my GL branches mostly use their own (copied) versions of
the core's drawing code so various ordering assumptions don't cause
problems. I'm hoping eventually to come up with a solution which lets
the GUIs have more independence in how they render the board. Ideally
this can be done without having to carbon-copy the drawing code into
every HID.

(You may notice I've started, in a few places, pulling common drawing
code into files with a common_* prefix under the hid/ folder.) My aim
was to set it up such that the GUIs could take or leave various bits of
the common rendering code as their authors desire.
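
For instance, each HID's setup code could point its hooks at a shared
implementation, or keep its own, roughly like this (illustrative names,
reusing the hook struct from the sketch above):

    /* A shared fallback implementation living under hid/common/
     * (hypothetical name).                                            */
    extern void common_draw_pcb_polygon (hidGC gc, PolygonType *polygon,
                                         const BoxType *clip_box);

    static void
    ghid_pick_drawing_hooks (hid_drawing_hooks *hooks)
    {
      /* Take the shared implementation for the bits we don't want to
       * special-case...                                               */
      hooks->draw_pcb_polygon = common_draw_pcb_polygon;

      /* ...or leave a hook pointing at our own version where the
       * renderer has better ideas (the GL code might cache a
       * tesselation of the polygon, for instance).                    */
    }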

Best regards,

-- 
Peter Clifton

Electrical Engineering Division,
Engineering Department,
University of Cambridge,
9, JJ Thomson Avenue,
Cambridge
CB3 0FA

Tel: +44 (0)7729 980173 - (No signal in the lab!)
Tel: +44 (0)1223 748328 - (Shared lab phone, ask for me)


