Re: Starting to contribute.



Hi Owen

> There are two basic reasons why server side extension is hard: one is
> simply that to do any hacking on the server side requires getting a test
> instance of the server going, and that's quite a project. The other is
> that the emphasis for the data model has been exporting the existing
> server objects via the data model, so you need to know how we implement
> "existing server objects" and add a new one. And that's no simple
> thing, since it means adding little bits of code scattered throughout
> lots of different directories.

How about rolling a VM image like the GDK that Ken VanDine has been
working on? Of course I'm assuming everything is free and open source,
with no horrible licensing issues that would prevent packaging...
which may be foolish of me :-)

In terms of an existing object, why not dissect and document the GConf
implementation?

> So, how to fix this and make it easy for people to implement their own
> applications with an online component, without becoming experts in a
> complicated Java server? One approach is to punt and say that the
> data model isn't involved for applications.  After all, we need to work
> with external non-GNOME web applications. People should just implement
> server components as a normal web app. And the simplest version of this
> is to use an existing application. I think the suggestion of basing a
> TODO list on rememberthemilk.com is a good one.

I don't like this approach for the simple reason that it's bad at
pushing data back to me: I'd have to run a timed "wake up and scrape
the API" job. Yucky.

> Alternatively, we could create some sort of "simple data storage API"
> that maybe wouldn't allow all the features described above, but would
> work fine for a basic TODO list.

That's what I had in mind with the simple object I mentioned in an
earlier post...

> In fact we already have this (!); if
> you just store your TODO list items in GConf and add a line to:
>
>  online-desktop/online-prefs-sync/online-prefs-sync.synclist
>
> (Or install a sync file for your application), then the TODO list items
> will automatically be propagated between all of your desktops. The
> downside is that conflict resolution is very simple to non-existent, so
> it's not going to handle offline TODO list editing on multiple machines
> well.
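A minimal sketch of why that "very simple" conflict resolution breaks
offline editing on multiple machines. This is purely illustrative
(toy names, not the actual online-prefs-sync code): when each machine
writes the whole key and the newest write wins, a concurrent offline
edit is silently dropped.

```python
# Toy model of per-key last-writer-wins sync. Each store maps a GConf-style
# key to (timestamp, value); on sync, the most recent write for a key wins.

def sync_lww(server, client):
    """Merge two {key: (timestamp, value)} stores; newest write wins."""
    merged = dict(server)
    for key, (ts, value) in client.items():
        if key not in merged or ts > merged[key][0]:
            merged[key] = (ts, value)
    return merged

# Two machines edit the same TODO list while offline:
laptop  = {"/apps/todo/items": (100, ["buy milk"])}
desktop = {"/apps/todo/items": (200, ["file taxes"])}

merged = sync_lww(laptop, desktop)
# The desktop's whole-list write wins and "buy milk" is lost -
# exactly the multi-machine offline-editing problem described above.
```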

What if I want to sync data from an existing application? With Conduit
I could sync from Evolution and Tomboy into online desktop via fake
GConf keys. But they wouldn't be real GConf keys, and I don't want them
in my GConf as well as in Evolution/Tomboy... (Note that we could do
this by converting contacts to actual GConf keys, or by pretending to
be GConf when talking to online desktop - I prefer there to be no
middleman.)

In this scenario, Conduit's always-up-to-date support would work with
Tomboy (and Evolution too, with a bit of love on the Python bindings
front), so as you update your Tomboy tasks they are pushed to online
desktop, which can push them on to anything else watching the generic
object type. Conduit on other machines could pick up notifications and
update their Tomboys and Evolutions. Conduit has conflict resolution
too...
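The notification-driven flow above can be sketched as follows. All the
class and method names here are hypothetical - just an illustration of
a client applying change notifications to its local copy, ignoring
anything older than what it already has:

```python
# Hypothetical client-side store for the notification flow described
# above: the server pushes "object changed" events, and each machine
# applies them to its local copy of the objects.

class LocalStore:
    def __init__(self):
        self.objects = {}  # uid -> (last_modified, type, blob)

    def apply_notification(self, uid, last_modified, obj_type, blob):
        """Apply a change notification; ignore stale (older) updates."""
        current = self.objects.get(uid)
        if current is None or last_modified > current[0]:
            self.objects[uid] = (last_modified, obj_type, blob)
            return True
        return False

store = LocalStore()
store.apply_notification("note-1", 10, "tomboy", "Buy milk")
store.apply_notification("note-1", 5, "tomboy", "stale update")  # ignored
```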

I would push for an even more generic object type, though, rather than
trying to squeeze contact and event data into GConf keys.

> But there is months of work there, so it's not going to happen quickly.
> For right now I think rememberthemilk or maybe the GConf approach is the
> right one.

Like I said before, Conduit will likely sync to rememberthemilk
anyway. And Hiveminder. And any other website our users might care
for. I'm definitely interested in pushing data at online desktop,
though, but I hope you can be swayed to add a more generic object type
of (uid, lastModified, type, blob) - the type letting me ask for just
one kind of blob (e.g. contacts, Tomboy notes, etc.), and the
lastModified letting us fetch a list of all contacts and work out
"x was added, y was modified and z was deleted". Notifications can
then be used to keep the local client in sync until the current
session ends...
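A sketch of the diff the proposed (uid, lastModified, type, blob)
records make possible - compare a previously fetched snapshot with
the current server list to work out what was added, modified and
deleted. The field names come from the proposal above; the function
itself is only illustrative:

```python
# Compute added/modified/deleted uids between two snapshots, where each
# snapshot maps uid -> (last_modified, type, blob). A bumped
# last_modified stamp is what marks an object as modified.

def diff(previous, current):
    added    = [uid for uid in current if uid not in previous]
    deleted  = [uid for uid in previous if uid not in current]
    modified = [uid for uid in current
                if uid in previous and current[uid][0] > previous[uid][0]]
    return added, modified, deleted

before = {"y": (1, "contact", "..."), "z": (1, "contact", "...")}
after_ = {"x": (1, "contact", "..."), "y": (2, "contact", "...")}

# x was added, y was modified (stamp 1 -> 2), z was deleted:
print(diff(before, after_))  # → (['x'], ['y'], ['z'])
```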

> A note about backup: we do have nightly backups of the entire
> online.gnome.org data set. So worst case, we lose 24 hours of data. But
> I think what Colin was getting at is that if you are storing terabytes
> of critical data for people, you have to think about the whole problem
> of failure and recovery differently: you will inevitably have all sorts
> of hardware failures, and you need to have data stored redundantly and
> be able to route around them without blinking. We aren't set up to do
> that sort of thing. And therefore (at least in the short term), we
> shouldn't be putting critical data on online.gnome.org.

So if Conduit did go with this approach, we would put a warning in the
GUI that, for now, it's in development and thus experimental and for
convenience - don't rely on it. Maybe even with a "you must type I
AGREE" dance... Yucky :-)

Thoughts?
John
