Merge lp://staging/~akretion-team/openobject-server/openobject-server_5.0_patches into lp://staging/openobject-server/5.0
- openobject-server_5.0_patches
- Merge into 5.0
Status: Merged
Merge reported by: Husen Daudi
Merged at revision: not available
Proposed branch: lp://staging/~akretion-team/openobject-server/openobject-server_5.0_patches
Merge into: lp://staging/openobject-server/5.0
Diff against target: 1103 lines, 4 files modified
  bin/osv/orm.py (+87/-102)
  bin/tools/convert.py (+166/-178)
  bin/tools/translate.py (+21/-21)
  bin/wizard/__init__.py (+8/-9)
To merge this branch: bzr merge lp://staging/~akretion-team/openobject-server/openobject-server_5.0_patches
Related bugs:
Related blueprints:
Reviews:
- Husen Daudi (community): Approve
- Stephane Wirtel (OpenERP): Approve
- Joël Grand-Guillaume @ camptocamp (community): Approve
- Fabien (Open ERP): Approve
- Christophe CHAUVET (community): Disapprove
Review via email: mp+14112@code.staging.launchpad.net
Commit message
Description of the change
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Dukai Gábor (gdukai) wrote:
IMHO this is a move, like the postgres compatibility fixes, that will have to be made anyway during the lifetime of the 5.0 branch. It's better done now than a few months later.
Daniel Baumann (daniel-debian-org) wrote:
> IMHO this is a move like the postgres compatibility fixes that has to be taken
> anyway in the lifetime of the 5.0 branch. It's better if it's done now than a
> few months later.
+1.
This is also quite important for the Debian maintenance of OpenERP: no (almost) bug-free packages working with lxml means no OpenERP in squeeze, and no OpenERP in squeeze most likely means no OpenERP in Ubuntu's next release.
Or in other words: for the sake of the Debian and Ubuntu OpenERP users, please consider applying.
Simone Orsi (simone-orsi) wrote:
+1
Christophe CHAUVET (christophe-chauvet) wrote:
Hi
I disagree with this. The 5.0 stable branch must only receive bug fixes, and this issue is not a bug.
I consider that the stable version of the server must run on the stable versions of Debian (currently Lenny) and Ubuntu (currently Hardy, 8.04 LTS).
On the other hand, if the GTK client doesn't work on Karmic, then yes, we must fix that.
5.2 is due to freeze in December, with an RC available in March/April for the next Ubuntu LTS (10.04).
Regards,
Dukai Gábor (gdukai) wrote:
That RC next March or April will hardly be usable in production for most people.
5.0 has already had much more serious changes than this fix.
Numérigraphe (numerigraphe) wrote:
> 5.2 must be freeze in December, and RC available (March/April) for the Ubuntu
> latest stable (10.04 LTS)
That reasoning doesn't hold, because:
- the Debian maintainer says Debian is going to drop OpenERP altogether if it's not fixed
- Debian freezes in December, so the next stable will ship either with an early alpha or with no OpenERP at all
- Debian imports for Ubuntu Lucid LTS freeze in February, so it will ship with a beta at best, or with no OpenERP at all
Dukai Gábor (gdukai) wrote:
5.0 has a life cycle of 5 years. That means users of 5.0 would have to stay on Lenny and Hardy for all five years.
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Hello Christophe,
I agree that the change is not small. But don't you agree it's testable?
Frankly, I've been using it since yesterday with no issue at all. I think some of us could confirm it is safe, and only after that would we merge it. What I think is that this is worth the effort; telling all Ubuntu users to get lost for, say, the next 6 months will by no means help the product's popularity...
I insist, this is not like the report-rewrite situation. No logic is changed here; it's just about swapping one library for another. All changed code lines are easily tested and available here, so why not make the effort to give it a test and then merge it eventually?
My 0.02cts
Christophe CHAUVET (christophe-chauvet) wrote:
> 5.0 has a life cycle of 5 years. That means the users of 5.0 will have to stay
> with lenny and hardy during all five years.
Do you know the word "production"?
In production we don't change the Ubuntu server every five minutes just because there is a new version.
I have a server in production running Debian Sarge; it works beautifully all day long.
Applying this fix may introduce a dangerous (and uncontrolled) regression, and that is a risk I don't want to take.
Regards,
Christophe CHAUVET (christophe-chauvet) wrote:
Hi Raphael
> I agree that the change is not small. But don't you agree it's testable?
Yes, we can test it to find all the regressions, but let's not apply it to 5.0 (see my last answer).
> Frankly using it since yesterday no issue at all. I think some of us could
> confirm it safe and after that only we merge it. What I think is that this
> is worth the effort, just saying fuck once more to all Ubuntu users for the
> next say 6 months will by no means help the product popularity...
For me, Ubuntu 9.10 is ready for the desktop, but I wouldn't take the risk of installing a server with this version; I prefer 8.04 LTS.
> I insist, this is not like the report rewrite situation. No logic is changed
> here, it's just about swapping a lib for an other. All changes code lines
> are easily tested, available here, why not make the effort to give it a test
> and then merge it eventually?
If I listen to this in geek mode, then yes, we should apply the merge; but if I listen to it in integrator mode, I don't want to see this in stable.
Have you tested this with RedHat 5.x (only Python 2.4.3)? I have tested it, and it doesn't work directly; for a customer we had to compile a new Python 2.5 and easy_install all the packages, but then NO SUPPORT CONTRACT FROM REDHAT :( . My question is: why Debian/Ubuntu and not RedHat/CentOS?
Regards,
Christophe CHAUVET (christophe-chauvet) wrote:
And I forgot: thanks Raphael for this good and meticulous work.
Regards,
Nicolas Évrard (nicolas-evrard) wrote:
We are not sure about the fate of this change because of the lack of unit tests.
A good test set (i.e. one that tests every feature) would prove that switching from one library to another does not impact the overall stability of OpenERP.
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Guys,
We all want unit tests, sure. The current ones at test.openobject are
not enough yet, that's sure. But nevertheless WE CAN TEST IT OURSELVES, even if
it's more work; I think for exceptional things that's valid too. Again,
knowing OpenERP quite a bit, I affirm that it's quite easy to ensure we
pass through every line I changed during fields_view_get calls (calling
views) and translation operations. There is no need to exercise any very
special business workflow for that; we just need to test the regular
things and, more importantly, open all the views we can; that alone will
exercise all those lines. I think others can confirm.
Christophe, I perfectly agree that in production you can use some old
Debian or Ubuntu, even if not all customers will like it (remember, lots of
customers are very small companies that already have trouble using Linux;
asking them to use Debian, or anything but the easiest Ubuntu they have on
their desktop, is not appealing to them).
Now, 50% of the reason I want this applied is US DEVELOPERS and
all the new potential developers OpenERP can attract to pitch in behind
the tests and contributions. Lots of us/them will want to use a reasonably
recent Ubuntu distro on their desktop, and lots of them will have only one
laptop. By abandoning them on Hardy, we are certainly losing lots of
them to Openbravo, Tryton and who knows what other package.
5.2 might happen in late December (BTW, I'm concerned, because I find this
quite soon; the community effort has not been coordinated yet, and lots of
required refactoring might unfortunately not be undertaken). Just like me,
you know perfectly well that the slow one-year dev cycles Tiny chose mean that
5.2 will be usable in production only around February at best, because it will
start with lots of bugs as usual. The more integrators know OpenERP/Tiny,
the more cautious they are about new versions that are developed in the
dark in India for a year, receiving almost no testing at all. Last year,
Smile participated in the 5.0 dev effort, deploying it by December 2008
already; we had to tackle A LOT of bugs with little reward, because stability
came only around March 2009. Meanwhile, I remember the experienced
CampToCamp guys laughing in my face, saying they would start testing 5.0
only by February 2009, because they couldn't afford losing that much time
on bugs/regressions. So my point is that yes, we should take care of 5.2
right now, but it won't be usable any time soon. And in those coming 6
months, I think OpenERP will just suffer a lot in terms of image if there is
no easy enough way to install an OpenERP dev environment on what will soon
be the most popular Linux distro.
Moreover, as Numérigraphe and others say,
doing nothing WILL NOT result in increased stability. It will result in
Debian/Ubuntu dropping recent OpenERP support and offering only very alpha,
unstable packages of OpenERP (like 5.0.3), or porting it using very dubious
debdiff files that the Debian/Ubuntu maintainers have no way to check, meaning
bugs WILL occur on Ubuntu distros; and as I said, this won't solve the dev
community issue, but rather frighten those devs away.
Finally, if you look carefully at that bra...
Fabien (Open ERP) (fp-tinyerp) wrote:
I don't have time to test the patch; I'll let the quality team do this work. But, following the discussion, I think we should merge the proposed patch. Keeping an old library may lead to more trouble.
The automated tests from test.openobject.com will test a lot of things (all fields_view_get methods are tested at each commit). I would suggest that the quality team also launch the automated tests from the base_module_quality tests on all official modules. If it passes manual checks and automated tests, you should merge.
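The fields_view_get coverage mentioned here boils down to walking every view arch with the new API. A minimal, self-contained sketch of that traversal style, using the stdlib ElementTree module (lxml.etree implements the same interface); the view arch and field names below are made-up examples, not real addons code:

```python
# Stand-in for lxml.etree: xml.etree.ElementTree exposes the same
# node.tag / node.get() / child-iteration API used by the patched orm.py.
import xml.etree.ElementTree as etree

arch = etree.fromstring("""
<form string="Partner">
    <field name="name"/>
    <field name="ref" groups="base.group_extended"/>
    <tree string="Contacts">
        <field name="email"/>
    </tree>
</form>
""")

def collect_fields(node, fields=None):
    """Walk the view arch and record every <field name="...">, roughly
    what __view_look_dom does after the lxml migration."""
    if fields is None:
        fields = {}
    if node.tag == 'field' and node.get('name'):
        fields[node.get('name')] = dict(node.attrib)
    for child in node:          # lxml/ElementTree: iterate children directly
        collect_fields(child, fields)
    return fields

print(sorted(collect_fields(arch)))   # ['email', 'name', 'ref']
```

Opening every view in a test run exercises exactly this code path, which is why view-opening smoke tests catch most regressions from the library swap.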
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
All right,
I'm happy with this position. I urge all serious community members to test
it on their end, to ensure we have no regression. At Akretion, two of us are
already testing it full-time on our regular projects; if something is
wrong with the patch, we will find out in time.
Cloves Almeida (cjalmeida) wrote:
+1 for the merge.
But for a project this size, the lack of unit testing is starting to hurt.
Why don't Tiny and the community organize a "Test Sprint" where we would dedicate some effort to writing unit tests for 5.0/5.2? Yes, they are boring, but with this number of regressions and not-so-frequent releases, I worry about the quality of the next stable release.
If Tiny writes some nice testing guidelines, I'm confident the community will jump in.
Joël Grand-Guillaume @ camptocamp (jgrandguillaume-c2c) wrote:
Hello,
I just tested it on a replica of a production instance, and it seems that everything is OK. As people said, the lack of unit tests is starting to hurt, but for now we have to deal with that...
I suggest merging this branch; even though we should avoid this kind of merge into a stable branch, here the need is too big to leave it only for the next versions...
So, for me, it's OK to merge this into the stable release.
My2Cents,
Joël
P.S. We are working on a test bot for the functional part here at Camptocamp. These will not be unit tests, but rather process tests, like: create an invoice, validate it and check that the result is OK with the tax amount, or try to fetch all partners, etc... I hope to post something soon so as to get some feedback on it.
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Joel, that sounds great and is indeed needed. I don't know what solution you
are putting in place.
But you should know that the best of breed of Ruby functional testing,
"Cucumber" http://
OOOR http://
OpenERP features. I know this is a different language, but if you don't have
good enough tools in Python, it might be worth considering. Note that it's
common to test Java code with Jython or JRuby... And arguably, Cucumber is
more an easy test DSL than plain Ruby.
I was also thinking about an enhanced module_recorder version that would
record each XML-RPC request and handle database dump/restore, allowing a
"replay"-like feature to easily create test suites. The same could eventually
be achieved using Selenium plus the browser, but that might be a little
harder to maintain.
All right, please all keep testing this branch so that in a few days it can
be merged safely.
Harry (OpenERP) (hmo-tinyerp) wrote:
We are working on functional testing.
Our goal: each module should have a unit test covering its functionality,
like the unit test of the sale module:
-- Create a Sale Order with demo data (the data is provided in the unit test)
-- Confirm this Sale Order
-- Print the Sale Order
-- Make the Invoice
These unit tests will be run by base_module_quality and also by test.openobject.
You can see the test.py file in trunk-addons: sale/unit_
We are using the Python unittest module (http://
You can see more unit tests on this branch:
https:/
After finishing and reviewing this work, we will merge this branch into trunk-addons.
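As a sketch of the unittest layout described above (create, confirm, invoice): the class and method names here (SaleOrderStub, action_confirm, action_invoice_create) are hypothetical in-memory stand-ins, not the real ORM API; a real test would drive the OpenERP object pool instead.

```python
import unittest

class SaleOrderStub(object):
    """Hypothetical in-memory stand-in for a sale order record."""
    def __init__(self):
        self.state = 'draft'
        self.invoice_ids = []

    def action_confirm(self):
        # confirmed, waiting to be invoiced
        self.state = 'manual'

    def action_invoice_create(self):
        # create one invoice and return its id
        self.invoice_ids.append(len(self.invoice_ids) + 1)
        return self.invoice_ids[-1]

class TestSaleFlow(unittest.TestCase):
    def test_confirm_and_invoice(self):
        order = SaleOrderStub()
        self.assertEqual(order.state, 'draft')
        order.action_confirm()
        self.assertEqual(order.state, 'manual')
        invoice_id = order.action_invoice_create()
        self.assertEqual(order.invoice_ids, [invoice_id])
```

Run with `python -m unittest` against the containing module; the same skeleton applies whether the object under test is a stub or a live ORM proxy.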
Cloves Almeida (cjalmeida) wrote:
There already seem to be lots of testing efforts. Shouldn't we try
to converge them?
CJ
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Hello Cloves,
I suggest you take a serious look at the smart framework released by
CampToCamp.
I believe business tests can't be written in a cleaner way today.
I'll do my best to provide a more generic OpenERP proxy for it based on
OOOR as soon as I package it as a gem and remove the Rails dependency.
Full story here: http://
Husen Daudi (husendaudi) wrote:
The patch seems to be working.
I have been using it for 3 days and haven't hit any error yet.
But it does not completely remove the xml.dom dependency.
It still exists in:
server/
server/
server/
server/
server/
The reason is:
from xml.dom import getDOMImplement
I think we don't have similar functionality in etree for getDOMImplement
Can anybody suggest a way to completely remove the xml.dom dependency?
Also, pychart is a third-party tool, so we have to wait for them to remove the xml dependency.
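Assuming the truncated import above refers to xml.dom's getDOMImplementation, the ElementTree API (implemented by both stdlib xml.etree.ElementTree and lxml.etree) can build documents directly, so an equivalent does exist. A hedged sketch; the 'report' element is a made-up example, not the actual server code:

```python
import xml.etree.ElementTree as etree  # lxml.etree exposes the same calls

# Old xml.dom style (assuming the import is getDOMImplementation):
#   doc = getDOMImplementation().createDocument(None, 'report', None)
#   root = doc.documentElement
# ElementTree / lxml style: build the tree directly, no factory needed.
root = etree.Element('report')
line = etree.SubElement(root, 'line')
line.set('name', 'total')        # replaces element.setAttribute(...)
line.text = '42.0'

# replaces doc.toxml()
print(etree.tostring(root))      # b'<report><line name="total">42.0</line></report>'
```

So the remaining xml.dom call sites could likely be ported the same way as the orm.py hunks, without waiting on upstream pychart.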
Stephane Wirtel (OpenERP) (stephane-openerp) wrote:
If we extract pychart or pypdf, we may get bugs in the server with the packages from the distributions. But we can remove some dependencies on xml.dom; for the rest, we can provide a patch.
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Hello, no error on my side either, and I've been using it for some 5 days now.
Still, dukai reported an error with this view: http://
I'll test that in a few hours.
About xml.dom, no idea.
BUT, I can tell you that on the latest Ubuntu Karmic this is not an issue.
So I don't know whether xml.dom is deprecated or not, but at least it causes
no trouble on recent Ubuntus/other Linux distros.
Regards,
Raphaël
Stephane Wirtel (OpenERP) (stephane-openerp):
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Again, let's just check Dukai's issue with the view http://
before merging.
I have no time to check right now, but if one of you can, dukai is hanging
out right now on #openobject IRC...
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
All right guys,
I spent a few hours digging into Dukai's error case, adding a view such as:
<record id="product_
<field name="name"
<field name="model"
<field name="type"
<field name="inherit_id" ref="product.
<field eval="7" name="priority"/>
<field name="arch" type="xml">
<tree string="Products" position="replace">
</record>
Yes, I saw the error, and I was unable to find a clean fix.
Now, I tell you, Dukai, this is a pathological use case (no offense ;-) ).
Indeed, why would you inherit a view only to replace its entire content?
IMHO this is plain wrong; I have never seen such a case.
What you should do instead is simply REPLACE the original view with your new definition, that is, use the same record id and don't use inherit_id at all.
Of course, REPLACING an existing view is EVIL, just like not calling super when overriding a method. Now, I fully understand why you want to replace it: it's because you would like to override the tree attributes, and the current OpenERP framework doesn't let you do that.
This will hold until I get my blueprint implemented and merged too (all right, it could be trunk this time if you are overly cautious): https:/
The good part of me spending those few hours is that I now know how I'll implement that blueprint in the near future.
NOW, LET ME SUM UP FOR THIS BRANCH:
Dukai found a pathological failing test case. BUT it should never be done this way, and there is a better way (use the same record id, no inherit_id), AND I HAVE NEVER SEEN SUCH A CASE, AT LEAST IN THE ADDONS CODEBASE. Moreover, for the pathological cases out there, the fix will be dead simple.
So I vote for not blocking this merge under the pretext that we found a pathological case breaking it.
I tested it for more than one week, on several real customer profiles/databases, with no bug at all unless we try to blend views in such irrational ways.
All right,
So I think this is enough testing; we had a scare with D...
Husen Daudi (husendaudi) wrote:
Hello Raphaël Valyi,
Your patch has been merged on stable, and dukai's problem has been fixed too.
Replacing a whole tree, form, etc. tag will work now.
Please test it once again and send feedback here.
Thanks for the contribution.
HDA.
Raphaël Valyi - http://www.akretion.com (rvalyi) wrote:
Hello,
I'm sorry, but why on earth is the commit an ordinary flattened commit and
not a regular, clean MERGE?
Is that the way you are improving the codebase?
Is there any good reason why it's not a proper merge?
Let me explain it better; this is how it looks right now:
suppose I have OpenERP running in production for critical applications,
so I track the 5.0 branch whenever I want to update the code or fix
bugs.
Currently, if I look at rev #1866.1.3, all I see is a huge patch that seems
risky, with almost no explanation.
So I would think: hey, Tiny is not serious; those guys are just committing
large experimental changes on claimed stable releases.
Furthermore, with the commit message, you associate my name with such a mess.
I'm sorry, this is exactly the same story as here
http://
repeating over and over. Guys, I'm sorry about that, but for OpenERP to
succeed we all need more professionalism.
Instead, if you did a clean merge, the enlightened user tracking the 5.0
branch, who tomorrow could be part of your community base,
could know the history of the commits and where this comes from
(trunk; and it would have been better, even on trunk, to properly merge the
original branches from Almacom or Activity Solutions (sorry, I can't remember
who did it initially)). The enlightened user could also look here and find
out that this was not yet another risky,
large, random codebase change, because this time it was done in a branch
and tested for at least a week by several people. Should he have a question,
he could get in touch with those people.
So please, unless there is a really good reason, redo the merge properly in
the history, and try to improve in the future; this is more important than
you might think. Who would have thought 3 years ago that those mistakes
http://
would last so long?
Trying to raise funds? You know what, you'll have experts examining your
codebase, and the messier it looks, the cheaper you'll sell it. Not
a good strategy.
Sorry for the rant, but I spent time getting this right, so I expect you to
behave more professionally and show that you do value community contributions.
Regards,
Raphaël Valyi
Husen Daudi (husendaudi) wrote:
Hello Raphaël Valyi,
I have merged your branch properly now.
Sorry for the inconvenience.
Regards,
HDA.
Preview Diff
1 | === modified file 'bin/osv/orm.py' |
2 | --- bin/osv/orm.py 2009-10-22 14:02:51 +0000 |
3 | +++ bin/osv/orm.py 2009-10-28 21:05:25 +0000 |
4 | @@ -50,16 +50,15 @@ |
5 | |
6 | import fields |
7 | import tools |
8 | +from tools.translate import _ |
9 | |
10 | import sys |
11 | |
12 | try: |
13 | - from xml import dom, xpath |
14 | + from lxml import etree |
15 | except ImportError: |
16 | - sys.stderr.write("ERROR: Import xpath module\n") |
17 | - sys.stderr.write("ERROR: Try to install the old python-xml package\n") |
18 | - sys.stderr.write('On Ubuntu Jaunty, try this: sudo cp /usr/lib/python2.6/dist-packages/oldxml/_xmlplus/utils/boolean.so /usr/lib/python2.5/site-packages/oldxml/_xmlplus/utils\n') |
19 | - raise |
20 | + sys.stderr.write("ERROR: Import lxml module\n") |
21 | + sys.stderr.write("ERROR: Try to install the python-lxml package\n") |
22 | |
23 | from tools.config import config |
24 | |
25 | @@ -997,14 +996,14 @@ |
26 | fields = {} |
27 | childs = True |
28 | |
29 | - if node.nodeType == node.ELEMENT_NODE and node.localName == 'field': |
30 | - if node.hasAttribute('name'): |
31 | + if node.tag == 'field': |
32 | + if node.get('name'): |
33 | attrs = {} |
34 | try: |
35 | - if node.getAttribute('name') in self._columns: |
36 | - column = self._columns[node.getAttribute('name')] |
37 | + if node.get('name') in self._columns: |
38 | + column = self._columns[node.get('name')] |
39 | else: |
40 | - column = self._inherit_fields[node.getAttribute('name')][2] |
41 | + column = self._inherit_fields[node.get('name')][2] |
42 | except: |
43 | column = False |
44 | |
45 | @@ -1012,65 +1011,63 @@ |
46 | relation = column._obj |
47 | childs = False |
48 | views = {} |
49 | - for f in node.childNodes: |
50 | - if f.nodeType == f.ELEMENT_NODE and f.localName in ('form', 'tree', 'graph'): |
51 | - node.removeChild(f) |
52 | + for f in node: |
53 | + if f.tag in ('form', 'tree', 'graph'): |
54 | + node.remove(f) |
55 | ctx = context.copy() |
56 | ctx['base_model_name'] = self._name |
57 | xarch, xfields = self.pool.get(relation).__view_look_dom_arch(cr, user, f, view_id, ctx) |
58 | - views[str(f.localName)] = { |
59 | + views[str(f.tag)] = { |
60 | 'arch': xarch, |
61 | 'fields': xfields |
62 | } |
63 | attrs = {'views': views} |
64 | - if node.hasAttribute('widget') and node.getAttribute('widget')=='selection': |
65 | + if node.get('widget') and node.get('widget') == 'selection': |
66 | # We can not use the 'string' domain has it is defined according to the record ! |
67 | - dom = [] |
68 | + dom = None |
69 | if column._domain and not isinstance(column._domain, (str, unicode)): |
70 | dom = column._domain |
71 | - |
72 | attrs['selection'] = self.pool.get(relation).name_search(cr, user, '', dom, context=context) |
73 | - if (node.hasAttribute('required') and not int(node.getAttribute('required'))) or not column.required: |
74 | + if (node.get('required') and not int(node.get('required'))) or not column.required: |
75 | attrs['selection'].append((False,'')) |
76 | - fields[node.getAttribute('name')] = attrs |
77 | + fields[node.get('name')] = attrs |
78 | |
79 | - elif node.nodeType==node.ELEMENT_NODE and node.localName in ('form', 'tree'): |
80 | - result = self.view_header_get(cr, user, False, node.localName, context) |
81 | + elif node.tag in ('form', 'tree'): |
82 | + result = self.view_header_get(cr, user, False, node.tag, context) |
83 | if result: |
84 | - node.setAttribute('string', result) |
85 | + node.set('string', result) |
86 | |
87 | - elif node.nodeType==node.ELEMENT_NODE and node.localName == 'calendar': |
88 | + elif node.tag == 'calendar': |
89 | for additional_field in ('date_start', 'date_delay', 'date_stop', 'color'): |
90 | - if node.hasAttribute(additional_field) and node.getAttribute(additional_field): |
91 | - fields[node.getAttribute(additional_field)] = {} |
92 | + if node.get(additional_field): |
93 | + fields[node.get(additional_field)] = {} |
94 | |
95 | - if node.nodeType == node.ELEMENT_NODE and node.hasAttribute('groups'): |
96 | - if node.getAttribute('groups'): |
97 | - groups = node.getAttribute('groups').split(',') |
98 | + if 'groups' in node.attrib: |
99 | + if node.get('groups'): |
100 | + groups = node.get('groups').split(',') |
101 | readonly = False |
102 | access_pool = self.pool.get('ir.model.access') |
103 | for group in groups: |
104 | readonly = readonly or access_pool.check_groups(cr, user, group) |
105 | if not readonly: |
106 | - node.setAttribute('invisible', '1') |
107 | - node.removeAttribute('groups') |
108 | + node.set('invisible', '1') |
109 | + del(node.attrib['groups']) |
110 | |
111 | - if node.nodeType == node.ELEMENT_NODE: |
112 | - # translate view |
113 | - if ('lang' in context) and not result: |
114 | - if node.hasAttribute('string') and node.getAttribute('string'): |
115 | - trans = self.pool.get('ir.translation')._get_source(cr, user, self._name, 'view', context['lang'], node.getAttribute('string').encode('utf8')) |
116 | - if not trans and ('base_model_name' in context): |
117 | - trans = self.pool.get('ir.translation')._get_source(cr, user, context['base_model_name'], 'view', context['lang'], node.getAttribute('string').encode('utf8')) |
118 | - if trans: |
119 | - node.setAttribute('string', trans) |
120 | - if node.hasAttribute('sum') and node.getAttribute('sum'): |
121 | - trans = self.pool.get('ir.translation')._get_source(cr, user, self._name, 'view', context['lang'], node.getAttribute('sum').encode('utf8')) |
122 | - if trans: |
123 | - node.setAttribute('sum', trans) |
124 | + # translate view |
125 | + if ('lang' in context) and not result: |
126 | + if node.get('string'): |
127 | + trans = self.pool.get('ir.translation')._get_source(cr, user, self._name, 'view', context['lang'], node.get('string').encode('utf8')) |
128 | + if not trans and ('base_model_name' in context): |
129 | + trans = self.pool.get('ir.translation')._get_source(cr, user, context['base_model_name'], 'view', context['lang'], node.get('string').encode('utf8')) |
130 | + if trans: |
131 | + node.set('string', trans) |
132 | + if node.get('sum'): |
133 | + trans = self.pool.get('ir.translation')._get_source(cr, user, self._name, 'view', context['lang'], node.get('sum').encode('utf8')) |
134 | + if trans: |
135 | + node.set('sum', trans) |
136 | |
137 | if childs: |
138 | - for f in node.childNodes: |
139 | + for f in node: |
140 | fields.update(self.__view_look_dom(cr, user, f, view_id, context)) |
141 | |
142 | return fields |
143 | @@ -1081,7 +1078,7 @@ |
144 | rolesobj = self.pool.get('res.roles') |
145 | usersobj = self.pool.get('res.users') |
146 | |
147 | - buttons = (n for n in node.getElementsByTagName('button') if n.getAttribute('type') != 'object') |
148 | + buttons = (n for n in node.getiterator('button') if n.get('type') != 'object') |
149 | for button in buttons: |
150 | can_click = True |
151 | if user != 1: # admin user has all roles |
152 | @@ -1093,7 +1090,7 @@ |
153 | INNER JOIN wkf_transition t ON (t.act_to = a.id) |
154 | WHERE wkf.osv = %s |
155 | AND t.signal = %s |
156 | - """, (self._name, button.getAttribute('name'),)) |
157 | + """, (self._name, button.get('name'),)) |
158 | roles = cr.fetchall() |
159 | |
160 | # draft -> valid = signal_next (role X) |
161 | @@ -1104,8 +1101,6 @@ |
162 | # |
163 | # running -> done = signal_next (role Z) |
164 | # running -> cancel = signal_cancel (role Z) |
165 | - |
166 | - |
167 | # As we don't know the object state, in this scenario, |
168 | # the button "signal_cancel" will be always shown as there is no restriction to cancel in draft |
169 | # the button "signal_next" will be show if the user has any of the roles (X Y or Z) |
170 | @@ -1113,9 +1108,9 @@ |
171 | if roles: |
172 | can_click = any((not role) or rolesobj.check(cr, user, user_roles, role) for (role,) in roles) |
173 | |
174 | - button.setAttribute('readonly', str(int(not can_click))) |
175 | + button.set('readonly', str(int(not can_click))) |
176 | |
177 | - arch = node.toxml(encoding="utf-8").replace('\t', '') |
178 | + arch = etree.tostring(node, encoding="utf-8").replace('\t', '') |
179 | fields = self.fields_get(cr, user, fields_def.keys(), context) |
180 | for field in fields_def: |
181 | if field == 'id': |
182 | @@ -1180,70 +1175,64 @@ |
183 | |
184 | def _inherit_apply(src, inherit): |
185 | def _find(node, node2): |
186 | - if node2.nodeType == node2.ELEMENT_NODE and node2.localName == 'xpath': |
187 | - res = xpath.Evaluate(node2.getAttribute('expr'), node) |
188 | + if node2.tag == 'xpath': |
189 | + res = node.xpath(node2.get('expr')) |
190 | return res and res[0] |
191 | else: |
192 | - if node.nodeType == node.ELEMENT_NODE and node.localName == node2.localName: |
193 | + for n in node.getiterator(node2.tag): |
194 | res = True |
195 | - for attr in node2.attributes.keys(): |
196 | + for attr in node2.attrib: |
197 | if attr == 'position': |
198 | continue |
199 | - if node.hasAttribute(attr): |
200 | - if node.getAttribute(attr)==node2.getAttribute(attr): |
201 | + if n.get(attr): |
202 | + if n.get(attr) == node2.get(attr): |
203 | continue |
204 | res = False |
205 | if res: |
206 | - return node |
207 | - for child in node.childNodes: |
208 | - res = _find(child, node2) |
209 | - if res: |
210 | - return res |
211 | + return n |
212 | return None |
213 | - |
214 | - |
215 | - doc_src = dom.minidom.parseString(encode(src)) |
216 | - doc_dest = dom.minidom.parseString(encode(inherit)) |
217 | - toparse = doc_dest.childNodes |
218 | + # End: _find(node, node2) |
219 | + |
220 | + doc_dest = etree.fromstring(encode(inherit)) |
221 | + toparse = [ doc_dest ] |
222 | while len(toparse): |
223 | node2 = toparse.pop(0) |
224 | - if not node2.nodeType == node2.ELEMENT_NODE: |
225 | - continue |
226 | - if node2.localName == 'data': |
227 | - toparse += node2.childNodes |
228 | - continue |
229 | - node = _find(doc_src, node2) |
230 | - if node: |
231 | + if node2.tag == 'data': |
232 | + toparse += [ c for c in doc_dest ] |
233 | + continue |
234 | + node = _find(src, node2) |
235 | + if node is not None: |
236 | pos = 'inside' |
237 | - if node2.hasAttribute('position'): |
238 | - pos = node2.getAttribute('position') |
239 | + if node2.get('position'): |
240 | + pos = node2.get('position') |
241 | if pos == 'replace': |
242 | - parent = node.parentNode |
243 | - for child in node2.childNodes: |
244 | - if child.nodeType == child.ELEMENT_NODE: |
245 | - parent.insertBefore(child, node) |
246 | - parent.removeChild(node) |
247 | + for child in node2: |
248 | + node.addprevious(child) |
249 | + node.getparent().remove(node) |
250 | else: |
251 | - sib = node.nextSibling |
252 | - for child in node2.childNodes: |
253 | - if child.nodeType == child.ELEMENT_NODE: |
254 | - if pos == 'inside': |
255 | - node.appendChild(child) |
256 | - elif pos == 'after': |
257 | - node.parentNode.insertBefore(child, sib) |
258 | - elif pos=='before': |
259 | - node.parentNode.insertBefore(child, node) |
260 | + sib = node.getnext() |
261 | + for child in node2: |
262 | + if pos == 'inside': |
263 | + node.append(child) |
264 | + elif pos == 'after': |
265 | + if sib is None: |
266 | + node.addnext(child) |
267 | else: |
268 | - raise AttributeError(_('Unknown position in inherited view %s !') % pos) |
269 | + sib.addprevious(child) |
270 | + elif pos == 'before': |
271 | + node.addprevious(child) |
272 | + else: |
273 | + raise AttributeError(_('Unknown position in inherited view %s !') % pos) |
274 | else: |
275 | attrs = ''.join([ |
276 | - ' %s="%s"' % (attr, node2.getAttribute(attr)) |
277 | - for attr in node2.attributes.keys() |
278 | + ' %s="%s"' % (attr, node2.get(attr)) |
279 | + for attr in node2.attrib |
280 | if attr != 'position' |
281 | ]) |
282 | - tag = "<%s%s>" % (node2.localName, attrs) |
283 | + tag = "<%s%s>" % (node2.tag, attrs) |
284 | raise AttributeError(_("Couldn't find tag '%s' in parent view !") % tag) |
285 | - return doc_src.toxml(encoding="utf-8").replace('\t', '') |
286 | + return src |
287 | + # End: _inherit_apply(src, inherit) |
288 | |
289 | result = {'type': view_type, 'model': self._name} |
290 | |
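The `_inherit_apply` rewrite above maps minidom's `insertBefore`/`appendChild` onto lxml's `addprevious`/`addnext`/`append`. A hedged sketch of the four `position` modes, using toy XML rather than a real OpenERP view arch; note the snapshot of the children list, since moving a node in lxml mutates its old parent mid-iteration:

```python
from lxml import etree

def apply_position(target, spec):
    """Move spec's children relative to target, mirroring _inherit_apply.

    list(spec) is snapshotted first: appending a child elsewhere *moves*
    it in lxml, which would otherwise skip siblings during iteration.
    """
    pos = spec.get('position') or 'inside'
    children = list(spec)
    if pos == 'replace':
        for child in children:
            target.addprevious(child)
        target.getparent().remove(target)
    elif pos == 'inside':
        for child in children:
            target.append(child)
    elif pos == 'after':
        sib = target.getnext()
        for child in children:
            if sib is None:
                target.addnext(child)  # target is last; order reverses for >1 child
            else:
                sib.addprevious(child)
    elif pos == 'before':
        for child in children:
            target.addprevious(child)
    else:
        raise AttributeError('Unknown position in inherited view %s !' % pos)

# Toy view, not a real OpenERP arch
src = etree.fromstring('<form><field name="a"/><field name="b"/></form>')
spec = etree.fromstring('<field name="a" position="after"><field name="c"/></field>')
apply_position(src.xpath("//field[@name='a']")[0], spec)
```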
291 | @@ -1295,7 +1284,8 @@ |
292 | result = _inherit_apply_rec(result, id) |
293 | return result |
294 | |
295 | - result['arch'] = _inherit_apply_rec(result['arch'], sql_res[3]) |
296 | + inherit_result = etree.fromstring(encode(result['arch'])) |
297 | + result['arch'] = _inherit_apply_rec(inherit_result, sql_res[3]) |
298 | |
299 | result['name'] = sql_res[1] |
300 | result['field_parent'] = sql_res[2] or False |
301 | @@ -1322,13 +1312,12 @@ |
302 | xml = self.__get_default_calendar_view() |
303 | else: |
304 | xml = '' |
305 | - result['arch'] = xml |
306 | + result['arch'] = etree.fromstring(xml) |
307 | result['name'] = 'default' |
308 | result['field_parent'] = False |
309 | result['view_id'] = 0 |
310 | |
311 | - doc = dom.minidom.parseString(encode(result['arch'])) |
312 | - xarch, xfields = self.__view_look_dom_arch(cr, user, doc, view_id, context=context) |
313 | + xarch, xfields = self.__view_look_dom_arch(cr, user, result['arch'], view_id, context=context) |
314 | result['arch'] = xarch |
315 | result['fields'] = xfields |
316 | if toolbar: |
317 | @@ -3044,7 +3033,3 @@ |
318 | if i in ids: |
319 | return False |
320 | return True |
321 | - |
322 | - |
323 | -# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4: |
324 | - |
325 | |
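The orm.py changes above and the convert.py changes below apply one consistent minidom-to-lxml mapping. A quick side-by-side of the substitutions this branch makes (a sketch, not exhaustive):

```python
from lxml import etree

doc = etree.fromstring('<data noupdate="1"><record id="r1">txt</record></data>')
rec = doc[0]

# minidom call                     -> lxml equivalent used by this branch
# node.getAttribute('id')          -> node.get('id', '')
# node.hasAttribute('id')          -> node.get('id')  (None when absent)
# node.setAttribute(k, v)          -> node.set(k, v)
# node.childNodes                  -> list(node) / node.getchildren()
# node.getElementsByTagName('t')   -> node.getiterator('t')
# text child nodes (node.data)     -> node.text
# doc.toxml(encoding='utf-8')      -> etree.tostring(doc, encoding='utf-8')
xml = etree.tostring(doc)
```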
326 | === modified file 'bin/tools/convert.py' |
327 | --- bin/tools/convert.py 2009-08-06 08:37:19 +0000 |
328 | +++ bin/tools/convert.py 2009-10-28 21:05:25 +0000 |
329 | @@ -21,7 +21,7 @@ |
330 | ############################################################################## |
331 | import re |
332 | import cStringIO |
333 | -import xml.dom.minidom |
334 | +from lxml import etree |
335 | import osv |
336 | import ir |
337 | import pooler |
338 | @@ -62,16 +62,13 @@ |
339 | def _eval_xml(self,node, pool, cr, uid, idref, context=None): |
340 | if context is None: |
341 | context = {} |
342 | - if node.nodeType == node.TEXT_NODE: |
343 | - return node.data.encode("utf8") |
344 | - elif node.nodeType == node.ELEMENT_NODE: |
345 | - if node.nodeName in ('field','value'): |
346 | - t = node.getAttribute('type') or 'char' |
347 | - f_model = node.getAttribute("model").encode('ascii') |
348 | - if len(node.getAttribute('search')): |
349 | - f_search = node.getAttribute("search").encode('utf-8') |
350 | - f_use = node.getAttribute("use").encode('ascii') |
351 | - f_name = node.getAttribute("name").encode('utf-8') |
352 | + if node.tag in ('field','value'): |
353 | + t = node.get('type','') or 'char' |
354 | + f_model = node.get("model", '').encode('ascii') |
355 | + if len(node.get('search','')): |
356 | + f_search = node.get("search",'').encode('utf-8') |
357 | + f_use = node.get("use",'').encode('ascii') |
358 | + f_name = node.get("name",'').encode('utf-8') |
359 | if len(f_use)==0: |
360 | f_use = "id" |
361 | q = eval(f_search, idref) |
362 | @@ -87,7 +84,7 @@ |
363 | if isinstance(f_val, tuple): |
364 | f_val = f_val[0] |
365 | return f_val |
366 | - a_eval = node.getAttribute('eval') |
367 | + a_eval = node.get('eval','') |
368 | if len(a_eval): |
369 | import time |
370 | from mx import DateTime |
371 | @@ -116,14 +113,11 @@ |
372 | if not id in idref: |
373 | idref[id]=self.id_get(cr, False, id) |
374 | return s % idref |
375 | - txt = '<?xml version="1.0"?>\n'+_process("".join([i.toxml().encode("utf8") for i in node.childNodes]), idref) |
376 | -# txt = '<?xml version="1.0"?>\n'+"".join([i.toxml().encode("utf8") for i in node.childNodes]) % idref |
377 | - |
378 | + txt = '<?xml version="1.0"?>\n'+_process("".join([etree.tostring(i).encode("utf8") for i in node.getchildren()]), idref) |
379 | return txt |
380 | if t in ('char', 'int', 'float'): |
381 | d = "" |
382 | - for n in [i for i in node.childNodes]: |
383 | - d+=str(_eval_xml(self,n,pool,cr,uid,idref)) |
384 | + d = node.text |
385 | if t == 'int': |
386 | d = d.strip() |
387 | if d=='None': |
388 | @@ -135,37 +129,37 @@ |
389 | return d |
390 | elif t in ('list','tuple'): |
391 | res=[] |
392 | - for n in [i for i in node.childNodes if (i.nodeType == i.ELEMENT_NODE and i.nodeName=='value')]: |
393 | + for n in [i for i in node.getchildren() if (i.tag=='value')]: |
394 | res.append(_eval_xml(self,n,pool,cr,uid,idref)) |
395 | if t=='tuple': |
396 | return tuple(res) |
397 | return res |
398 | - elif node.nodeName=="getitem": |
399 | - for n in [i for i in node.childNodes if (i.nodeType == i.ELEMENT_NODE)]: |
400 | - res=_eval_xml(self,n,pool,cr,uid,idref) |
401 | - if not res: |
402 | - raise LookupError |
403 | - elif node.getAttribute('type') in ("int", "list"): |
404 | - return res[int(node.getAttribute('index'))] |
405 | - else: |
406 | - return res[node.getAttribute('index').encode("utf8")] |
407 | - elif node.nodeName=="function": |
408 | - args = [] |
409 | - a_eval = node.getAttribute('eval') |
410 | - if len(a_eval): |
411 | - idref['ref'] = lambda x: self.id_get(cr, False, x) |
412 | - args = eval(a_eval, idref) |
413 | - for n in [i for i in node.childNodes if (i.nodeType == i.ELEMENT_NODE)]: |
414 | - args.append(_eval_xml(self,n, pool, cr, uid, idref, context)) |
415 | - model = pool.get(node.getAttribute('model')) |
416 | - method = node.getAttribute('name') |
417 | - res = getattr(model, method)(cr, uid, *args) |
418 | - return res |
419 | - elif node.nodeName=="test": |
420 | - d = "" |
421 | - for n in [i for i in node.childNodes]: |
422 | - d+=str(_eval_xml(self,n,pool,cr,uid,idref, context=context)) |
423 | - return d |
424 | + elif node.tag == "getitem": |
425 | + for n in [i for i in node.getchildren()]: |
426 | + res=_eval_xml(self,n,pool,cr,uid,idref) |
427 | + if not res: |
428 | + raise LookupError |
429 | + elif node.get('type','') in ("int", "list"): |
430 | + return res[int(node.get('index',''))] |
431 | + else: |
432 | + return res[node.get('index','').encode("utf8")] |
433 | + elif node.tag == "function": |
434 | + args = [] |
435 | + a_eval = node.get('eval','') |
436 | + if len(a_eval): |
437 | + idref['ref'] = lambda x: self.id_get(cr, False, x) |
438 | + args = eval(a_eval, idref) |
439 | + for n in [i for i in node.getchildren()]: |
440 | + return_val = _eval_xml(self,n, pool, cr, uid, idref, context) |
441 | + if return_val != None: |
442 | + args.append(return_val) |
443 | + model = pool.get(node.get('model','')) |
444 | + method = node.get('name','') |
445 | + res = getattr(model, method)(cr, uid, *args) |
446 | + return res |
447 | + elif node.tag == "test": |
448 | + d = node.text |
449 | + return d |
450 | |
451 | |
452 | escape_re = re.compile(r'(?<!\\)/') |
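The `_eval_xml` hunk above now reads literal values from `node.text` instead of concatenating minidom text child nodes. The char/int/float branch, reduced to a self-contained sketch:

```python
from lxml import etree

def eval_value(node):
    """char/int/float branch of _eval_xml, reduced to a sketch."""
    t = node.get('type') or 'char'
    # minidom walked childNodes for text; lxml exposes it as node.text
    d = node.text or ''
    if t == 'int':
        d = d.strip()
        return None if d == 'None' else int(d)
    if t == 'float':
        return float(d.strip())
    return d
```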
453 | @@ -205,31 +199,31 @@ |
454 | |
455 | @staticmethod |
456 | def nodeattr2bool(node, attr, default=False): |
457 | - if not node.hasAttribute(attr): |
458 | + if not node.get(attr): |
459 | return default |
460 | - val = node.getAttribute(attr).strip() |
461 | + val = node.get(attr).strip() |
462 | if not val: |
463 | return default |
464 | return val.lower() not in ('0', 'false', 'off') |
465 | - |
466 | + |
467 | def isnoupdate(self, data_node=None): |
468 | - return self.noupdate or (data_node and self.nodeattr2bool(data_node, 'noupdate', False)) |
469 | + return self.noupdate or (len(data_node) and self.nodeattr2bool(data_node, 'noupdate', False)) |
470 | |
471 | def get_context(self, data_node, node, eval_dict): |
472 | - data_node_context = (data_node and data_node.getAttribute('context').encode('utf8')) |
473 | + data_node_context = (len(data_node) and data_node.get('context','').encode('utf8')) |
474 | if data_node_context: |
475 | context = eval(data_node_context, eval_dict) |
476 | else: |
477 | context = {} |
478 | |
479 | - node_context = node.getAttribute("context").encode('utf8') |
480 | + node_context = node.get("context",'').encode('utf8') |
481 | if len(node_context): |
482 | context.update(eval(node_context, eval_dict)) |
483 | |
484 | return context |
485 | |
486 | def get_uid(self, cr, uid, data_node, node): |
487 | - node_uid = node.getAttribute('uid') or (data_node and data_node.getAttribute('uid')) |
488 | + node_uid = node.get('uid','') or (len(data_node) and data_node.get('uid','')) |
489 | if len(node_uid): |
490 | return self.id_get(cr, None, node_uid) |
491 | return uid |
492 | @@ -249,9 +243,9 @@ |
493 | self.logger.notifyChannel('init', netsvc.LOG_ERROR, 'id: %s is to long (max: 64)'% (id,)) |
494 | |
495 | def _tag_delete(self, cr, rec, data_node=None): |
496 | - d_model = rec.getAttribute("model") |
497 | - d_search = rec.getAttribute("search") |
498 | - d_id = rec.getAttribute("id") |
499 | + d_model = rec.get("model",'') |
500 | + d_search = rec.get("search",'') |
501 | + d_id = rec.get("id",'') |
502 | ids = [] |
503 | if len(d_search): |
504 | ids = self.pool.get(d_model).search(cr,self.uid,eval(d_search)) |
505 | @@ -269,24 +263,24 @@ |
506 | def _tag_report(self, cr, rec, data_node=None): |
507 | res = {} |
508 | for dest,f in (('name','string'),('model','model'),('report_name','name')): |
509 | - res[dest] = rec.getAttribute(f).encode('utf8') |
510 | + res[dest] = rec.get(f,'').encode('utf8') |
511 | assert res[dest], "Attribute %s of report is empty !" % (f,) |
512 | for field,dest in (('rml','report_rml'),('xml','report_xml'),('xsl','report_xsl'),('attachment','attachment'),('attachment_use','attachment_use')): |
513 | - if rec.hasAttribute(field): |
514 | - res[dest] = rec.getAttribute(field).encode('utf8') |
515 | - if rec.hasAttribute('auto'): |
516 | - res['auto'] = eval(rec.getAttribute('auto')) |
517 | - if rec.hasAttribute('sxw'): |
518 | - sxw_content = misc.file_open(rec.getAttribute('sxw')).read() |
519 | + if rec.get(field): |
520 | + res[dest] = rec.get(field,'').encode('utf8') |
521 | + if rec.get('auto'): |
522 | + res['auto'] = eval(rec.get('auto','')) |
523 | + if rec.get('sxw'): |
524 | + sxw_content = misc.file_open(rec.get('sxw','')).read() |
525 | res['report_sxw_content'] = sxw_content |
526 | - if rec.hasAttribute('header'): |
527 | - res['header'] = eval(rec.getAttribute('header')) |
528 | - res['multi'] = rec.hasAttribute('multi') and eval(rec.getAttribute('multi')) |
529 | - xml_id = rec.getAttribute('id').encode('utf8') |
530 | + if rec.get('header'): |
531 | + res['header'] = eval(rec.get('header','')) |
532 | + res['multi'] = rec.get('multi','') and eval(rec.get('multi','')) |
533 | + xml_id = rec.get('id','').encode('utf8') |
534 | self._test_xml_id(xml_id) |
535 | |
536 | - if rec.hasAttribute('groups'): |
537 | - g_names = rec.getAttribute('groups').split(',') |
538 | + if rec.get('groups'): |
539 | + g_names = rec.get('groups','').split(',') |
540 | groups_value = [] |
541 | groups_obj = self.pool.get('res.groups') |
542 | for group in g_names: |
543 | @@ -300,13 +294,13 @@ |
544 | |
545 | id = self.pool.get('ir.model.data')._update(cr, self.uid, "ir.actions.report.xml", self.module, res, xml_id, noupdate=self.isnoupdate(data_node), mode=self.mode) |
546 | self.idref[xml_id] = int(id) |
547 | - |
548 | - |
549 | - if not rec.hasAttribute('menu') or eval(rec.getAttribute('menu')): |
550 | - keyword = str(rec.getAttribute('keyword') or 'client_print_multi') |
551 | + |
552 | + |
553 | + if not rec.get('menu') or eval(rec.get('menu','')): |
554 | + keyword = str(rec.get('keyword','') or 'client_print_multi') |
555 | keys = [('action',keyword),('res_model',res['model'])] |
556 | value = 'ir.actions.report.xml,'+str(id) |
557 | - replace = rec.hasAttribute('replace') and rec.getAttribute("replace") or True |
558 | + replace = rec.get("replace",'') or True |
559 | self.pool.get('ir.model.data').ir_set(cr, self.uid, 'action', keyword, res['name'], [res['model']], value, replace=replace, isobject=True, xml_id=xml_id) |
560 | return False |
561 | |
562 | @@ -319,16 +313,16 @@ |
563 | return False |
564 | |
565 | def _tag_wizard(self, cr, rec, data_node=None): |
566 | - string = rec.getAttribute("string").encode('utf8') |
567 | - model = rec.getAttribute("model").encode('utf8') |
568 | - name = rec.getAttribute("name").encode('utf8') |
569 | - xml_id = rec.getAttribute('id').encode('utf8') |
570 | + string = rec.get("string",'').encode('utf8') |
571 | + model = rec.get("model",'').encode('utf8') |
572 | + name = rec.get("name",'').encode('utf8') |
573 | + xml_id = rec.get('id','').encode('utf8') |
574 | self._test_xml_id(xml_id) |
575 | - multi = rec.hasAttribute('multi') and eval(rec.getAttribute('multi')) |
576 | + multi = rec.get('multi','') and eval(rec.get('multi','')) |
577 | res = {'name': string, 'wiz_name': name, 'multi': multi, 'model': model} |
578 | |
579 | - if rec.hasAttribute('groups'): |
580 | - g_names = rec.getAttribute('groups').split(',') |
581 | + if rec.get('groups'): |
582 | + g_names = rec.get('groups','').split(',') |
583 | groups_value = [] |
584 | groups_obj = self.pool.get('res.groups') |
585 | for group in g_names: |
586 | @@ -343,20 +337,19 @@ |
587 | id = self.pool.get('ir.model.data')._update(cr, self.uid, "ir.actions.wizard", self.module, res, xml_id, noupdate=self.isnoupdate(data_node), mode=self.mode) |
588 | self.idref[xml_id] = int(id) |
589 | # ir_set |
590 | - if (not rec.hasAttribute('menu') or eval(rec.getAttribute('menu'))) and id: |
591 | - keyword = str(rec.getAttribute('keyword') or 'client_action_multi') |
592 | + if (not rec.get('menu') or eval(rec.get('menu',''))) and id: |
593 | + keyword = str(rec.get('keyword','') or 'client_action_multi') |
594 | keys = [('action',keyword),('res_model',model)] |
595 | value = 'ir.actions.wizard,'+str(id) |
596 | - replace = rec.hasAttribute('replace') and \ |
597 | - rec.getAttribute("replace") or True |
598 | + replace = rec.get("replace",'') or True |
599 | self.pool.get('ir.model.data').ir_set(cr, self.uid, 'action', keyword, string, [model], value, replace=replace, isobject=True, xml_id=xml_id) |
600 | return False |
601 | |
602 | def _tag_url(self, cr, rec, data_node=None): |
603 | - url = rec.getAttribute("string").encode('utf8') |
604 | - target = rec.getAttribute("target").encode('utf8') |
605 | - name = rec.getAttribute("name").encode('utf8') |
606 | - xml_id = rec.getAttribute('id').encode('utf8') |
607 | + url = rec.get("string",'').encode('utf8') |
608 | + target = rec.get("target",'').encode('utf8') |
609 | + name = rec.get("name",'').encode('utf8') |
610 | + xml_id = rec.get('id','').encode('utf8') |
611 | self._test_xml_id(xml_id) |
612 | |
613 | res = {'name': name, 'url': url, 'target':target} |
614 | @@ -364,34 +357,31 @@ |
615 | id = self.pool.get('ir.model.data')._update(cr, self.uid, "ir.actions.url", self.module, res, xml_id, noupdate=self.isnoupdate(data_node), mode=self.mode) |
616 | self.idref[xml_id] = int(id) |
617 | # ir_set |
618 | - if (not rec.hasAttribute('menu') or eval(rec.getAttribute('menu'))) and id: |
619 | - keyword = str(rec.getAttribute('keyword') or 'client_action_multi') |
620 | + if (not rec.get('menu') or eval(rec.get('menu',''))) and id: |
621 | + keyword = str(rec.get('keyword','') or 'client_action_multi') |
622 | keys = [('action',keyword)] |
623 | value = 'ir.actions.url,'+str(id) |
624 | - replace = rec.hasAttribute('replace') and \ |
625 | - rec.getAttribute("replace") or True |
626 | + replace = rec.get("replace",'') or True |
627 | self.pool.get('ir.model.data').ir_set(cr, self.uid, 'action', keyword, url, ["ir.actions.url"], value, replace=replace, isobject=True, xml_id=xml_id) |
628 | return False |
629 | |
630 | def _tag_act_window(self, cr, rec, data_node=None): |
631 | - name = rec.hasAttribute('name') and rec.getAttribute('name').encode('utf-8') |
632 | - xml_id = rec.getAttribute('id').encode('utf8') |
633 | + name = rec.get('name','').encode('utf-8') |
634 | + xml_id = rec.get('id','').encode('utf8') |
635 | self._test_xml_id(xml_id) |
636 | - type = rec.hasAttribute('type') and rec.getAttribute('type').encode('utf-8') or 'ir.actions.act_window' |
637 | + type = rec.get('type','').encode('utf-8') or 'ir.actions.act_window' |
638 | view_id = False |
639 | - if rec.hasAttribute('view'): |
640 | - view_id = self.id_get(cr, 'ir.actions.act_window', rec.getAttribute('view').encode('utf-8')) |
641 | - domain = rec.hasAttribute('domain') and rec.getAttribute('domain').encode('utf-8') |
642 | - context = rec.hasAttribute('context') and rec.getAttribute('context').encode('utf-8') or '{}' |
643 | - res_model = rec.getAttribute('res_model').encode('utf-8') |
644 | - src_model = rec.hasAttribute('src_model') and rec.getAttribute('src_model').encode('utf-8') |
645 | - view_type = rec.hasAttribute('view_type') and rec.getAttribute('view_type').encode('utf-8') or 'form' |
646 | - view_mode = rec.hasAttribute('view_mode') and rec.getAttribute('view_mode').encode('utf-8') or 'tree,form' |
647 | - usage = rec.hasAttribute('usage') and rec.getAttribute('usage').encode('utf-8') |
648 | - limit = rec.hasAttribute('limit') and rec.getAttribute('limit').encode('utf-8') |
649 | - auto_refresh = rec.hasAttribute('auto_refresh') \ |
650 | - and rec.getAttribute('auto_refresh').encode('utf-8') |
651 | -# groups_id = rec.hasAttribute('groups') and rec.getAttribute('groups').encode('utf-8') |
652 | + if rec.get('view'): |
653 | + view_id = self.id_get(cr, 'ir.actions.act_window', rec.get('view','').encode('utf-8')) |
654 | + domain = rec.get('domain','').encode('utf-8') |
655 | + context = rec.get('context','').encode('utf-8') or '{}' |
656 | + res_model = rec.get('res_model','').encode('utf-8') |
657 | + src_model = rec.get('src_model','').encode('utf-8') |
658 | + view_type = rec.get('view_type','').encode('utf-8') or 'form' |
659 | + view_mode = rec.get('view_mode','').encode('utf-8') or 'tree,form' |
660 | + usage = rec.get('usage','').encode('utf-8') |
661 | + limit = rec.get('limit','').encode('utf-8') |
662 | + auto_refresh = rec.get('auto_refresh','').encode('utf-8') |
663 | |
664 | # def ref() added because , if context has ref('id') eval wil use this ref |
665 | |
666 | @@ -417,8 +407,8 @@ |
667 | # 'groups_id':groups_id, |
668 | } |
669 | |
670 | - if rec.hasAttribute('groups'): |
671 | - g_names = rec.getAttribute('groups').split(',') |
672 | + if rec.get('groups'): |
673 | + g_names = rec.get('groups','').split(',') |
674 | groups_value = [] |
675 | groups_obj = self.pool.get('res.groups') |
676 | for group in g_names: |
677 | @@ -430,8 +420,8 @@ |
678 | groups_value.append((4, group_id)) |
679 | res['groups_id'] = groups_value |
680 | |
681 | - if rec.hasAttribute('target'): |
682 | - res['target'] = rec.getAttribute('target') |
683 | + if rec.get('target'): |
684 | + res['target'] = rec.get('target','') |
685 | id = self.pool.get('ir.model.data')._update(cr, self.uid, 'ir.actions.act_window', self.module, res, xml_id, noupdate=self.isnoupdate(data_node), mode=self.mode) |
686 | self.idref[xml_id] = int(id) |
687 | |
688 | @@ -439,7 +429,7 @@ |
689 | keyword = 'client_action_relate' |
690 | keys = [('action', keyword), ('res_model', res_model)] |
691 | value = 'ir.actions.act_window,'+str(id) |
692 | - replace = rec.hasAttribute('replace') and rec.getAttribute('replace') or True |
693 | + replace = rec.get('replace','') or True |
694 | self.pool.get('ir.model.data').ir_set(cr, self.uid, 'action', keyword, xml_id, [src_model], value, replace=replace, isobject=True, xml_id=xml_id) |
695 | # TODO add remove ir.model.data |
696 | return False |
697 | @@ -448,8 +438,8 @@ |
698 | if not self.mode=='init': |
699 | return False |
700 | res = {} |
701 | - for field in [i for i in rec.childNodes if (i.nodeType == i.ELEMENT_NODE and i.nodeName=="field")]: |
702 | - f_name = field.getAttribute("name").encode('utf-8') |
703 | + for field in [i for i in rec.getchildren() if (i.tag=="field")]: |
704 | + f_name = field.get("name",'').encode('utf-8') |
705 | f_val = _eval_xml(self,field,self.pool, cr, self.uid, self.idref) |
706 | res[f_name] = f_val |
707 | self.pool.get('ir.model.data').ir_set(cr, self.uid, res['key'], res['key2'], res['name'], res['models'], res['value'], replace=res.get('replace',True), isobject=res.get('isobject', False), meta=res.get('meta',None)) |
708 | @@ -458,21 +448,21 @@ |
709 | def _tag_workflow(self, cr, rec, data_node=None): |
710 | if self.isnoupdate(data_node) and self.mode != 'init': |
711 | return |
712 | - model = str(rec.getAttribute('model')) |
713 | - w_ref = rec.getAttribute('ref') |
714 | + model = str(rec.get('model','')) |
715 | + w_ref = rec.get('ref','') |
716 | if len(w_ref): |
717 | id = self.id_get(cr, model, w_ref) |
718 | else: |
719 | - assert rec.childNodes, 'You must define a child node if you dont give a ref' |
720 | - element_childs = [i for i in rec.childNodes if i.nodeType == i.ELEMENT_NODE] |
721 | - assert len(element_childs) == 1, 'Only one child node is accepted (%d given)' % len(rec.childNodes) |
722 | + assert rec.getchildren(), 'You must define a child node if you dont give a ref' |
723 | + element_childs = [i for i in rec.getchildren()] |
724 | + assert len(element_childs) == 1, 'Only one child node is accepted (%d given)' % len(rec.getchildren()) |
725 | id = _eval_xml(self, element_childs[0], self.pool, cr, self.uid, self.idref) |
726 | |
727 | uid = self.get_uid(cr, self.uid, data_node, rec) |
728 | wf_service = netsvc.LocalService("workflow") |
729 | wf_service.trg_validate(uid, model, |
730 | id, |
731 | - str(rec.getAttribute('action')), cr) |
732 | + str(rec.get('action','')), cr) |
733 | return False |
734 | |
735 | # |
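Several hunks above replace `data_node and ...` with `len(data_node) and ...`: a minidom Document is always truthy, but an lxml element with no children evaluates falsy (lxml discourages boolean-testing elements outright). A small demonstration of what the new check actually measures:

```python
from lxml import etree

empty = etree.fromstring('<data noupdate="1"/>')
full = etree.fromstring('<data><record/></data>')

# len(element) counts direct children; the patch tests this instead of
# boolean-testing the element. The check therefore means "has child
# elements", not "a node was passed" -- attributes are unaffected.
has_children = len(full) > 0
no_children = len(empty) == 0
```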
736 | @@ -483,12 +473,12 @@ |
737 | # parent="parent_id" |
738 | # |
739 | def _tag_menuitem(self, cr, rec, data_node=None): |
740 | - rec_id = rec.getAttribute("id").encode('ascii') |
741 | + rec_id = rec.get("id",'').encode('ascii') |
742 | self._test_xml_id(rec_id) |
743 | - m_l = map(escape, escape_re.split(rec.getAttribute("name").encode('utf8'))) |
744 | + m_l = map(escape, escape_re.split(rec.get("name",'').encode('utf8'))) |
745 | |
746 | values = {'parent_id': False} |
747 | - if not rec.hasAttribute('parent'): |
748 | + if not rec.get('parent'): |
749 | pid = False |
750 | for idx, menu_elem in enumerate(m_l): |
751 | if pid: |
752 | @@ -500,7 +490,7 @@ |
753 | values = {'parent_id': pid,'name':menu_elem} |
754 | elif res: |
755 | pid = res[0] |
756 | - xml_id = idx==len(m_l)-1 and rec.getAttribute('id').encode('utf8') |
757 | + xml_id = idx==len(m_l)-1 and rec.get('id','').encode('utf8') |
758 | try: |
759 | npid = self.pool.get('ir.model.data')._update_dummy(cr, self.uid, 'ir.ui.menu', self.module, xml_id, idx==len(m_l)-1) |
760 | except: |
761 | @@ -510,18 +500,18 @@ |
762 | self.logger.notifyChannel("init", netsvc.LOG_WARNING, 'Warning no ID for submenu %s of menu %s !' % (menu_elem, str(m_l))) |
763 | pid = self.pool.get('ir.ui.menu').create(cr, self.uid, {'parent_id' : pid, 'name' : menu_elem}) |
764 | else: |
765 | - menu_parent_id = self.id_get(cr, 'ir.ui.menu', rec.getAttribute('parent')) |
766 | + menu_parent_id = self.id_get(cr, 'ir.ui.menu', rec.get('parent','')) |
767 | values = {'parent_id': menu_parent_id} |
768 | - if rec.hasAttribute('name'): |
769 | - values['name'] = rec.getAttribute('name') |
770 | + if rec.get('name'): |
771 | + values['name'] = rec.get('name','') |
772 | try: |
773 | - res = [ self.id_get(cr, 'ir.ui.menu', rec.getAttribute('id')) ] |
774 | + res = [ self.id_get(cr, 'ir.ui.menu', rec.get('id','')) ] |
775 | except: |
776 | res = None |
777 | |
778 | - if rec.hasAttribute('action'): |
779 | - a_action = rec.getAttribute('action').encode('utf8') |
780 | - a_type = rec.getAttribute('type').encode('utf8') or 'act_window' |
781 | + if rec.get('action'): |
782 | + a_action = rec.get('action','').encode('utf8') |
783 | + a_type = rec.get('type','').encode('utf8') or 'act_window' |
784 | icons = { |
785 | "act_window": 'STOCK_NEW', |
786 | "report.xml": 'STOCK_PASTE', |
787 | @@ -560,13 +550,13 @@ |
788 | resw = cr.fetchone() |
789 | if (not values.get('name', False)) and resw: |
790 | values['name'] = resw[0] |
791 | - if rec.hasAttribute('sequence'): |
792 | - values['sequence'] = int(rec.getAttribute('sequence')) |
793 | - if rec.hasAttribute('icon'): |
794 | - values['icon'] = str(rec.getAttribute('icon')) |
795 | + if rec.get('sequence'): |
796 | + values['sequence'] = int(rec.get('sequence','')) |
797 | + if rec.get('icon'): |
798 | + values['icon'] = str(rec.get('icon','')) |
799 | |
800 | - if rec.hasAttribute('groups'): |
801 | - g_names = rec.getAttribute('groups').split(',') |
802 | + if rec.get('groups'): |
803 | + g_names = rec.get('groups','').split(',') |
804 | groups_value = [] |
805 | groups_obj = self.pool.get('res.groups') |
806 | for group in g_names: |
807 | @@ -578,16 +568,16 @@ |
808 | groups_value.append((4, group_id)) |
809 | values['groups_id'] = groups_value |
810 | |
811 | - xml_id = rec.getAttribute('id').encode('utf8') |
812 | + xml_id = rec.get('id','').encode('utf8') |
813 | self._test_xml_id(xml_id) |
814 | pid = self.pool.get('ir.model.data')._update(cr, self.uid, 'ir.ui.menu', self.module, values, xml_id, noupdate=self.isnoupdate(data_node), mode=self.mode, res_id=res and res[0] or False) |
815 | |
816 | if rec_id and pid: |
817 | self.idref[rec_id] = int(pid) |
818 | |
819 | - if rec.hasAttribute('action') and pid: |
820 | - a_action = rec.getAttribute('action').encode('utf8') |
821 | - a_type = rec.getAttribute('type').encode('utf8') or 'act_window' |
822 | + if rec.get('action') and pid: |
823 | + a_action = rec.get('action','').encode('utf8') |
824 | + a_type = rec.get('type','').encode('utf8') or 'act_window' |
825 | a_id = self.id_get(cr, 'ir.actions.%s' % a_type, a_action) |
826 | action = "ir.actions.%s,%d" % (a_type, a_id) |
827 | self.pool.get('ir.model.data').ir_set(cr, self.uid, 'action', 'tree_but_open', 'Menuitem', [('ir.ui.menu', int(pid))], action, True, True, xml_id=rec_id) |
828 | @@ -600,17 +590,17 @@ |
829 | if self.isnoupdate(data_node) and self.mode != 'init': |
830 | return |
831 | |
832 | - rec_model = rec.getAttribute("model").encode('ascii') |
833 | + rec_model = rec.get("model",'').encode('ascii') |
834 | model = self.pool.get(rec_model) |
835 | assert model, "The model %s does not exist !" % (rec_model,) |
836 | - rec_id = rec.getAttribute("id").encode('ascii') |
837 | + rec_id = rec.get("id",'').encode('ascii') |
838 | self._test_xml_id(rec_id) |
839 | - rec_src = rec.getAttribute("search").encode('utf8') |
840 | - rec_src_count = rec.getAttribute("count") |
841 | - |
842 | - severity = rec.getAttribute("severity").encode('ascii') or netsvc.LOG_ERROR |
843 | - |
844 | - rec_string = rec.getAttribute("string").encode('utf8') or 'unknown' |
845 | + rec_src = rec.get("search",'').encode('utf8') |
846 | + rec_src_count = rec.get("count",'') |
847 | + |
848 | + severity = rec.get("severity",'').encode('ascii') or netsvc.LOG_ERROR |
849 | + |
850 | + rec_string = rec.get("string",'').encode('utf8') or 'unknown' |
851 | |
852 | ids = None |
853 | eval_dict = {'ref': _ref(self, cr)} |
854 | @@ -651,8 +641,8 @@ |
855 | globals['floatEqual'] = self._assert_equals |
856 | globals['ref'] = ref |
857 | globals['_ref'] = ref |
858 | - for test in [i for i in rec.childNodes if (i.nodeType == i.ELEMENT_NODE and i.nodeName=="test")]: |
859 | - f_expr = test.getAttribute("expr").encode('utf-8') |
860 | + for test in [i for i in rec.getchildren() if (i.tag=="test")]: |
861 | + f_expr = test.get("expr",'').encode('utf-8') |
862 | expected_value = _eval_xml(self, test, self.pool, cr, uid, self.idref, context=context) or True |
863 | expression_value = eval(f_expr, globals) |
864 | if expression_value != expected_value: # assertion failed |
865 | @@ -661,7 +651,7 @@ |
866 | ' xmltag: %s\n' \ |
867 | ' expected value: %r\n' \ |
868 | ' obtained value: %r\n' \ |
869 | - % (rec_string, test.toxml(), expected_value, expression_value) |
870 | + % (rec_string, etree.tostring(test), expected_value, expression_value) |
871 | self.logger.notifyChannel('init', severity, msg) |
872 | sevval = getattr(logging, severity.upper()) |
873 | if sevval >= config['assert_exit_level']: |
874 | @@ -672,12 +662,11 @@ |
875 | self.assert_report.record_assertion(True, severity) |
876 | |
877 | def _tag_record(self, cr, rec, data_node=None): |
878 | - rec_model = rec.getAttribute("model").encode('ascii') |
879 | + rec_model = rec.get("model",'').encode('ascii') |
880 | model = self.pool.get(rec_model) |
881 | assert model, "The model %s does not exist !" % (rec_model,) |
882 | - rec_id = rec.getAttribute("id").encode('ascii') |
883 | + rec_id = rec.get("id",'').encode('ascii') |
884 | self._test_xml_id(rec_id) |
885 | - |
886 | if self.isnoupdate(data_node) and self.mode != 'init': |
887 | # check if the xml record has an id string |
888 | if rec_id: |
889 | @@ -699,21 +688,20 @@ |
890 | # we don't want to create it, so we skip it |
891 | return None |
892 | # else, we let the record to be created |
893 | - |
894 | + |
895 | else: |
896 | # otherwise it is skipped |
897 | return None |
898 | - |
899 | res = {} |
900 | - for field in [i for i in rec.childNodes if (i.nodeType == i.ELEMENT_NODE and i.nodeName=="field")]: |
901 | + for field in [i for i in rec.getchildren() if (i.tag == "field")]: |
902 | #TODO: most of this code is duplicated above (in _eval_xml)... |
903 | - f_name = field.getAttribute("name").encode('utf-8') |
904 | - f_ref = field.getAttribute("ref").encode('ascii') |
905 | - f_search = field.getAttribute("search").encode('utf-8') |
906 | - f_model = field.getAttribute("model").encode('ascii') |
907 | + f_name = field.get("name",'').encode('utf-8') |
908 | + f_ref = field.get("ref",'').encode('ascii') |
909 | + f_search = field.get("search",'').encode('utf-8') |
910 | + f_model = field.get("model",'').encode('ascii') |
911 | if not f_model and model._columns.get(f_name,False): |
912 | f_model = model._columns[f_name]._obj |
913 | - f_use = field.getAttribute("use").encode('ascii') or 'id' |
914 | + f_use = field.get("use",'').encode('ascii') or 'id' |
915 | f_val = False |
916 | |
917 | if len(f_search): |
918 | @@ -761,24 +749,22 @@ |
919 | return int(self.pool.get('ir.model.data').read(cr, self.uid, [result], ['res_id'])[0]['res_id']) |
920 | |
921 | def parse(self, xmlstr): |
922 | - d = xml.dom.minidom.parseString(xmlstr) |
923 | - de = d.documentElement |
924 | + de = etree.XML(xmlstr) |
925 | |
926 | - if not de.nodeName in ['terp', 'openerp']: |
927 | + if not de.tag in ['terp', 'openerp']: |
928 | self.logger.notifyChannel("init", netsvc.LOG_ERROR, "Mismatch xml format" ) |
929 | raise Exception( "Mismatch xml format: only terp or openerp as root tag" ) |
930 | |
931 | - if de.nodeName == 'terp': |
932 | + if de.tag == 'terp': |
933 | self.logger.notifyChannel("init", netsvc.LOG_WARNING, "The tag <terp/> is deprecated, use <openerp/>") |
934 | |
935 | - for n in [i for i in de.childNodes if (i.nodeType == i.ELEMENT_NODE and i.nodeName=="data")]: |
936 | - for rec in n.childNodes: |
937 | - if rec.nodeType == rec.ELEMENT_NODE: |
938 | - if rec.nodeName in self._tags: |
939 | + for n in [i for i in de.getchildren() if (i.tag=="data")]: |
940 | + for rec in n.getchildren(): |
941 | + if rec.tag in self._tags: |
942 | try: |
943 | - self._tags[rec.nodeName](self.cr, rec, n) |
944 | + self._tags[rec.tag](self.cr, rec, n) |
945 | except: |
946 | - self.logger.notifyChannel("init", netsvc.LOG_ERROR, '\n'+rec.toxml()) |
947 | + self.logger.notifyChannel("init", netsvc.LOG_ERROR, '\n'+etree.tostring(rec)) |
948 | self.cr.rollback() |
949 | raise |
950 | return True |
951 | @@ -891,11 +877,13 @@ |
952 | pool=pooler.get_pool(cr.dbname) |
953 | cr=pooler.db.cursor() |
954 | idref = {} |
955 | - d = xml.dom.minidom.getDOMImplementation().createDocument(None, "terp", None) |
956 | - de = d.documentElement |
957 | - data=d.createElement("data") |
958 | - de.appendChild(data) |
959 | - de.appendChild(d.createTextNode('Some textual content.')) |
960 | + |
961 | + page = etree.Element ( 'terp' ) |
962 | + doc = etree.ElementTree ( page ) |
963 | + data = etree.SubElement ( page, 'data' ) |
964 | + text_node = etree.SubElement ( page, 'text' ) |
965 | + text_node.text = 'Some textual content.' |
966 | + |
967 | cr.commit() |
968 | cr.close() |
969 | |
970 | |
971 | === modified file 'bin/tools/translate.py' |
972 | --- bin/tools/translate.py 2009-09-29 12:39:23 +0000 |
973 | +++ bin/tools/translate.py 2009-10-28 21:05:25 +0000 |
974 | @@ -23,9 +23,9 @@ |
975 | import os |
976 | from os.path import join |
977 | import fnmatch |
978 | -import csv, xml.dom, re |
979 | +import csv, re |
980 | +from lxml import etree |
981 | import tools, pooler |
982 | -from osv.orm import BrowseRecordError |
983 | import ir |
984 | import netsvc |
985 | from tools.misc import UpdateableStr |
986 | @@ -335,9 +335,9 @@ |
987 | |
988 | def trans_parse_xsl(de): |
989 | res = [] |
990 | - for n in [i for i in de.childNodes if (i.nodeType == i.ELEMENT_NODE)]: |
991 | - if n.hasAttribute("t"): |
992 | - for m in [j for j in n.childNodes if (j.nodeType == j.TEXT_NODE)]: |
993 | + for n in [i for i in de.getchildren()]: |
994 | + if n.get("t"): |
995 | + for m in [j for j in n.getchildren()]: |
996 | l = m.data.strip().replace('\n',' ') |
997 | if len(l): |
998 | res.append(l.encode("utf8")) |
999 | @@ -346,8 +346,8 @@ |
1000 | |
1001 | def trans_parse_rml(de): |
1002 | res = [] |
1003 | - for n in [i for i in de.childNodes if (i.nodeType == i.ELEMENT_NODE)]: |
1004 | - for m in [j for j in n.childNodes if (j.nodeType == j.TEXT_NODE)]: |
1005 | + for n in [i for i in de.getchildren()]: |
1006 | + for m in [j for j in n.getchildren()]: |
1007 | string_list = [s.replace('\n', ' ').strip() for s in re.split('\[\[.+?\]\]', m.data)] |
1008 | for s in string_list: |
1009 | if s: |
1010 | @@ -357,15 +357,15 @@ |
1011 | |
1012 | def trans_parse_view(de): |
1013 | res = [] |
1014 | - if de.hasAttribute("string"): |
1015 | - s = de.getAttribute('string') |
1016 | - if s: |
1017 | - res.append(s.encode("utf8")) |
1018 | - if de.hasAttribute("sum"): |
1019 | - s = de.getAttribute('sum') |
1020 | - if s: |
1021 | - res.append(s.encode("utf8")) |
1022 | - for n in [i for i in de.childNodes if (i.nodeType == i.ELEMENT_NODE)]: |
1023 | + if de.get("string"): |
1024 | + s = de.get('string') |
1025 | + if s: |
1026 | + res.append(s.encode("utf8")) |
1027 | + if de.get("sum"): |
1028 | + s = de.get('sum') |
1029 | + if s: |
1030 | + res.append(s.encode("utf8")) |
1031 | + for n in [i for i in de.getchildren()]: |
1032 | res.extend(trans_parse_view(n)) |
1033 | return res |
1034 | |
1035 | @@ -434,8 +434,8 @@ |
1036 | obj = pool.get(model).browse(cr, uid, res_id) |
1037 | |
1038 | if model=='ir.ui.view': |
1039 | - d = xml.dom.minidom.parseString(encode(obj.arch)) |
1040 | - for t in trans_parse_view(d.documentElement): |
1041 | + d = etree.XML(encode(obj.arch)) |
1042 | + for t in trans_parse_view(d): |
1043 | push_translation(module, 'view', encode(obj.model), 0, t) |
1044 | elif model=='ir.actions.wizard': |
1045 | service_name = 'wizard.'+encode(obj.wiz_name) |
1046 | @@ -522,10 +522,10 @@ |
1047 | report_type = "xsl" |
1048 | try: |
1049 | xmlstr = tools.file_open(fname).read() |
1050 | - d = xml.dom.minidom.parseString(xmlstr) |
1051 | - for t in parse_func(d.documentElement): |
1052 | + d = etree.XML(xmlstr) |
1053 | + for t in parse_func(d): |
1054 | push_translation(module, report_type, name, 0, t) |
1055 | - except IOError, xml.dom.expatbuilder.expat.ExpatError: |
1056 | + except (IOError, etree.XMLSyntaxError): |
1057 | if fname: |
1058 | logger.notifyChannel("i18n", netsvc.LOG_ERROR, "couldn't export translation for report %s %s %s" % (name, report_type, fname)) |
1059 | |
1060 | |
1061 | === modified file 'bin/wizard/__init__.py' |
1062 | --- bin/wizard/__init__.py 2009-04-27 08:46:42 +0000 |
1063 | +++ bin/wizard/__init__.py 2009-10-28 21:05:25 +0000 |
1064 | @@ -24,7 +24,7 @@ |
1065 | from tools import copy |
1066 | from tools.misc import UpdateableStr, UpdateableDict |
1067 | from tools.translate import translate |
1068 | -from xml import dom |
1069 | +from lxml import etree |
1070 | |
1071 | import ir |
1072 | import pooler |
1073 | @@ -50,13 +50,12 @@ |
1074 | self.wiz_name = name |
1075 | |
1076 | def translate_view(self, cr, node, state, lang): |
1077 | - if node.nodeType == node.ELEMENT_NODE: |
1078 | - if node.hasAttribute('string') and node.getAttribute('string'): |
1079 | - trans = translate(cr, self.wiz_name+','+state, 'wizard_view', lang, node.getAttribute('string').encode('utf8')) |
1080 | + if node.get('string'): |
1081 | + trans = translate(cr, self.wiz_name+','+state, 'wizard_view', lang, node.get('string').encode('utf8')) |
1082 | if trans: |
1083 | - node.setAttribute('string', trans) |
1084 | - for n in node.childNodes: |
1085 | - self.translate_view(cr, n, state, lang) |
1086 | + node.set('string', trans) |
1087 | + for n in node.getchildren(): |
1088 | + self.translate_view(cr, n, state, lang) |
1089 | |
1090 | def execute_cr(self, cr, uid, data, state='init', context=None): |
1091 | if not context: |
1092 | @@ -132,9 +131,9 @@ |
1093 | |
1094 | # translate arch |
1095 | if not isinstance(arch, UpdateableStr): |
1096 | - doc = dom.minidom.parseString(arch.encode('utf8')) |
1097 | + doc = etree.XML(arch) |
1098 | self.translate_view(cr, doc, state, lang) |
1099 | - arch = doc.toxml() |
1100 | + arch = etree.tostring(doc) |
1101 | |
1102 | # translate buttons |
1103 | button_list = list(button_list) |
Backport from trunk to 5.0: use lxml/etree, standard and maintained Python XML libraries, instead of the deprecated expat and xml.dom modules, which are not maintained anymore and hence cause a lot of headaches when installing the OpenERP 5.0 branch on Ubuntu distros, for instance. As it stands, OpenERP 5.0 won't even run on the new Ubuntu Karmic, because no port of the deprecated packages seems to exist anymore.
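For reference, the API translation applied throughout the diff above can be sketched as follows. This is an illustrative snippet, not code from the patch: the XML document and variable names are made up, and it assumes lxml is installed.

```python
from lxml import etree

# Illustrative document, similar in shape to an OpenERP data file.
xmlstr = ("<openerp><data>"
          "<record id='r1' model='res.partner'>"
          "<field name='name'>Demo</field>"
          "</record></data></openerp>")

root = etree.XML(xmlstr)          # minidom: parseString(xmlstr).documentElement
assert root.tag == 'openerp'      # minidom: root.nodeName

# minidom: [i for i in root.childNodes
#           if i.nodeType == i.ELEMENT_NODE and i.nodeName == 'data']
data = [n for n in root.getchildren() if n.tag == 'data'][0]
rec = data.getchildren()[0]

# minidom: rec.getAttribute('model'); lxml's .get() takes a default value,
# hence the patch's recurring rec.get("model", '') pattern.
assert rec.get('model') == 'res.partner'
assert rec.get('missing', '') == ''

xml_again = etree.tostring(rec)   # minidom: rec.toxml(); returns bytes in lxml
```

The key simplification is that lxml children are always elements, so the repeated `nodeType == ELEMENT_NODE` filtering from the minidom version disappears.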
In any case, the fix is clean and easy to test. I tested it extensively, and I ask partners/community members to do the same; then it will be safe to merge. It involves a limited amount of code for which it is easy to find test cases (use demo data + the manufacturing profile + translations, for instance). I've been especially cautious with the merges/conflicts and tests. You can trust me at least more than the guys from Tiny, who didn't even resolve conflicts properly when merging that old_trunk back into the new trunk branch...
We did it carefully, and it can be tested.
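One quick way to exercise the new parsing path is a root-tag smoke test that mirrors the logic of the patched parse() in convert.py. This is a standalone sketch, not the actual method, and assumes lxml is installed:

```python
from lxml import etree

def check_root_tag(xmlstr):
    # Mirror the patched parse(): accept <openerp> (and the deprecated
    # <terp>) as the root tag, and reject anything else.
    de = etree.XML(xmlstr)
    if de.tag not in ('terp', 'openerp'):
        raise Exception("Mismatch xml format: only terp or openerp as root tag")
    return de.tag

assert check_root_tag("<openerp><data/></openerp>") == 'openerp'
assert check_root_tag("<terp><data/></terp>") == 'terp'  # deprecated but accepted
rejected = False
try:
    check_root_tag("<html/>")
except Exception:
    rejected = True
assert rejected
```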
So please, test and test again, but please do merge it into 5.0 to save users from installation headaches. You will also spare all the Ubuntu users who would otherwise come to the forum asking why on earth it won't install, or who would be doomed to use older, buggier versions of OpenERP. You will also spare the Debian/Ubuntu maintainers, who are not OpenERP specialists at all, from maintaining ugly patches themselves whenever they have time, patches that will hardly match stable releases of OpenERP.
Thanks in advance