Linux 4gvps.4gvps.com 3.10.0-1127.18.2.vz7.163.46 #1 SMP Fri Nov 20 21:47:55 MSK 2020 x86_64
SOFT: Apache   PHP: 7.4.33
/opt/alt/python35/lib64/python3.5/site-packages/
38.135.39.45

 
[ NAME ]                               [ SIZE ]    [ PERM ]     [ DATE ]
__pycache__                            dir         drwxr-xr-x   2024-08-13 01:16
aiohttp                                dir         drwxr-xr-x   2021-02-14 07:03
aiohttp-3.0.8-py3.5.egg-info           dir         drwxr-xr-x   2020-07-14 01:15
multidict                              dir         drwxr-xr-x   2020-07-14 01:15
multidict-4.1.0-py3.5.egg-info         dir         drwxr-xr-x   2022-07-24 06:06
playhouse                              dir         drwxr-xr-x   2024-08-13 01:16
psutil                                 dir         drwxr-xr-x   2024-08-13 01:16
psutil-5.4.7-py3.5.egg-info            dir         drwxr-xr-x   2024-08-13 01:16
yaml                                   dir         drwxr-xr-x   2020-06-15 13:37
yarl                                   dir         drwxr-xr-x   2020-07-14 01:15
yarl-1.1.1-py3.5.egg-info              dir         drwxr-xr-x   2021-02-14 07:03
PyYAML-3.13-py3.5.egg-info             1.479 KB    -rw-r--r--   2019-09-27 10:04
README                                 0.116 KB    -rw-r--r--   2019-11-01 23:02
_yaml.cpython-35m-x86_64-linux-gnu.so  247.883 KB  -rwxr-xr-x   2019-09-27 10:04
peewee-2.8.3-py3.5.egg-info            9.076 KB    -rw-r--r--   2024-07-08 09:38
peewee.py                              173.352 KB  -rw-r--r--   2016-08-25 18:17
pwiz.py                                6.8 KB      -rw-r--r--   2024-07-08 09:38
# May you do good and not evil
# May you find forgiveness for yourself and forgive others
# May you share freely, never taking more than you give. -- SQLite source code
#
# As we enjoy great advantages from the inventions of others, we should be glad
# of an opportunity to serve others by an invention of ours, and this we should
# do freely and generously. -- Ben Franklin
#
#     (\
#     (  \  /(o)\     caw!
#     (   \/  ()/ /)
#     (   `;.))'".)
#     `(/////.-'
# =====))=))===()
#   ///'
#  //
# '
import calendar
import datetime
import decimal
import hashlib
import itertools
import logging
import operator
import re
import sys
import threading
import time
import uuid
import weakref
from bisect import bisect_left
from bisect import bisect_right
from collections import deque
from collections import namedtuple
try:
    from collections import OrderedDict
except ImportError:
    OrderedDict = dict
from copy import deepcopy
from functools import wraps
from inspect import isclass

__version__ = '2.8.3'
__all__ = [
    'BareField', 'BigIntegerField', 'BlobField', 'BooleanField', 'CharField',
    'Check', 'Clause', 'CompositeKey', 'DatabaseError', 'DataError',
    'DateField', 'DateTimeField', 'DecimalField', 'DeferredRelation',
    'DoesNotExist', 'DoubleField', 'DQ', 'Field', 'FixedCharField',
    'FloatField', 'fn', 'ForeignKeyField', 'ImproperlyConfigured',
    'IntegerField', 'IntegrityError', 'InterfaceError', 'InternalError',
    'JOIN', 'JOIN_FULL', 'JOIN_INNER', 'JOIN_LEFT_OUTER', 'Model',
    'MySQLDatabase', 'NotSupportedError', 'OperationalError', 'Param',
    'PostgresqlDatabase', 'prefetch', 'PrimaryKeyField', 'ProgrammingError',
    'Proxy', 'R', 'SmallIntegerField', 'SqliteDatabase', 'SQL', 'TextField',
    'TimeField', 'TimestampField', 'Using', 'UUIDField', 'Window',
]

# Set default logging handler to avoid "No handlers could be found for logger
# "peewee"" warnings.
try:  # Python 2.7+
    from logging import NullHandler
except ImportError:
    class NullHandler(logging.Handler):
        def emit(self, record):
            pass

# All peewee-generated logs are logged to this namespace.
logger = logging.getLogger('peewee')
logger.addHandler(NullHandler())

# Python 2/3 compatibility helpers. These helpers are used internally and are
# not exported.
_METACLASS_ = '_metaclass_helper_'

def with_metaclass(meta, base=object):
    return meta(_METACLASS_, (base,), {})

PY2 = sys.version_info[0] == 2
PY3 = sys.version_info[0] == 3
PY26 = sys.version_info[:2] == (2, 6)
if PY3:
    import builtins
    from collections import Callable
    from functools import reduce
    callable = lambda c: isinstance(c, Callable)
    unicode_type = str
    string_type = bytes
    basestring = str
    print_ = getattr(builtins, 'print')
    binary_construct = lambda s: bytes(s.encode('raw_unicode_escape'))
    long = int

    def reraise(tp, value, tb=None):
        if value.__traceback__ is not tb:
            raise value.with_traceback(tb)
        raise value
elif PY2:
    unicode_type = unicode
    string_type = basestring
    binary_construct = buffer

    def print_(s):
        sys.stdout.write(s)
        sys.stdout.write('\n')
    exec('def reraise(tp, value, tb=None): raise tp, value, tb')
else:
    raise RuntimeError('Unsupported python version.')

if PY26:
    _M = 10**6
    total_seconds = lambda t: (t.microseconds + 0.0 +
        (t.seconds + t.days * 24 * 3600) * _M) / _M
else:
    total_seconds = lambda t: t.total_seconds()

# By default, peewee supports Sqlite, MySQL and Postgresql.
try:
    from pysqlite2 import dbapi2 as pysq3
except ImportError:
    pysq3 = None
try:
    import sqlite3
except ImportError:
    sqlite3 = pysq3
else:
    if pysq3 and pysq3.sqlite_version_info >= sqlite3.sqlite_version_info:
        sqlite3 = pysq3
try:
    from psycopg2cffi import compat
    compat.register()
except ImportError:
    pass
try:
    import psycopg2
    from psycopg2 import extensions as pg_extensions
except ImportError:
    psycopg2 = None
try:
    import MySQLdb as mysql  # prefer the C module.
except ImportError:
    try:
        import pymysql as mysql
    except ImportError:
        mysql = None

try:
    from playhouse._speedups import format_date_time
    from playhouse._speedups import sort_models_topologically
    from playhouse._speedups import strip_parens
except ImportError:
    def format_date_time(value, formats, post_process=None):
        post_process = post_process or (lambda x: x)
        for fmt in formats:
            try:
                return post_process(datetime.datetime.strptime(value, fmt))
            except ValueError:
                pass
        return value

    def sort_models_topologically(models):
        """Sort models topologically so that parents will precede children."""
        models = set(models)
        seen = set()
        ordering = []

        def dfs(model):
            if model in models and model not in seen:
                seen.add(model)
                for foreign_key in model._meta.reverse_rel.values():
                    dfs(foreign_key.model_class)
                ordering.append(model)  # parent will follow descendants

        # Order models by name and table initially to guarantee total ordering.
        names = lambda m: (m._meta.name, m._meta.db_table)
        for m in sorted(models, key=names, reverse=True):
            dfs(m)
        return list(reversed(ordering))

    def strip_parens(s):
        # Quick sanity check.
        if not s or s[0] != '(':
            return s

        ct = i = 0
        l = len(s)
        while i < l:
            if s[i] == '(' and s[l - 1] == ')':
                ct += 1
                i += 1
                l -= 1
            else:
                break
        if ct:
            # If we ever end up with negatively-balanced parentheses, then we
            # know that one of the outer parentheses was required.
            unbalanced_ct = 0
            required = 0
            for i in range(ct, l - ct):
                if s[i] == '(':
                    unbalanced_ct += 1
                elif s[i] == ')':
                    unbalanced_ct -= 1
                if unbalanced_ct < 0:
                    required += 1
                    unbalanced_ct = 0
                if required == ct:
                    break
            ct -= required
        if ct > 0:
            return s[ct:-ct]
        return s

try:
    from playhouse._speedups import _DictQueryResultWrapper
    from playhouse._speedups import _ModelQueryResultWrapper
    from playhouse._speedups import _SortedFieldList
    from playhouse._speedups import _TuplesQueryResultWrapper
except ImportError:
    _DictQueryResultWrapper = _ModelQueryResultWrapper = _SortedFieldList = \
        _TuplesQueryResultWrapper = None

if sqlite3:
    sqlite3.register_adapter(decimal.Decimal, str)
    sqlite3.register_adapter(datetime.date, str)
    sqlite3.register_adapter(datetime.time, str)

DATETIME_PARTS = ['year', 'month', 'day', 'hour', 'minute', 'second']
DATETIME_LOOKUPS = set(DATETIME_PARTS)

# Sqlite does not support the `date_part` SQL function, so we will define an
# implementation in python.
SQLITE_DATETIME_FORMATS = ( '%Y-%m-%d %H:%M:%S', '%Y-%m-%d %H:%M:%S.%f', '%Y-%m-%d', '%H:%M:%S', '%H:%M:%S.%f', '%H:%M') def _sqlite_date_part(lookup_type, datetime_string): assert lookup_type in DATETIME_LOOKUPS if not datetime_string: return dt = format_date_time(datetime_string, SQLITE_DATETIME_FORMATS) return getattr(dt, lookup_type) SQLITE_DATE_TRUNC_MAPPING = { 'year': '%Y', 'month': '%Y-%m', 'day': '%Y-%m-%d', 'hour': '%Y-%m-%d %H', 'minute': '%Y-%m-%d %H:%M', 'second': '%Y-%m-%d %H:%M:%S'} MYSQL_DATE_TRUNC_MAPPING = SQLITE_DATE_TRUNC_MAPPING.copy() MYSQL_DATE_TRUNC_MAPPING['minute'] = '%Y-%m-%d %H:%i' MYSQL_DATE_TRUNC_MAPPING['second'] = '%Y-%m-%d %H:%i:%S' def _sqlite_date_trunc(lookup_type, datetime_string): assert lookup_type in SQLITE_DATE_TRUNC_MAPPING if not datetime_string: return dt = format_date_time(datetime_string, SQLITE_DATETIME_FORMATS) return dt.strftime(SQLITE_DATE_TRUNC_MAPPING[lookup_type]) def _sqlite_regexp(regex, value): return re.search(regex, value, re.I) is not None class attrdict(dict): def __getattr__(self, attr): return self[attr] # Operators used in binary expressions. OP = attrdict( AND='and', OR='or', ADD='+', SUB='-', MUL='*', DIV='/', BIN_AND='&', BIN_OR='|', XOR='^', MOD='%', EQ='=', LT='<', LTE='<=', GT='>', GTE='>=', NE='!=', IN='in', NOT_IN='not in', IS='is', IS_NOT='is not', LIKE='like', ILIKE='ilike', BETWEEN='between', REGEXP='regexp', CONCAT='||', ) JOIN = attrdict( INNER='INNER', LEFT_OUTER='LEFT OUTER', RIGHT_OUTER='RIGHT OUTER', FULL='FULL', ) JOIN_INNER = JOIN.INNER JOIN_LEFT_OUTER = JOIN.LEFT_OUTER JOIN_FULL = JOIN.FULL RESULTS_NAIVE = 1 RESULTS_MODELS = 2 RESULTS_TUPLES = 3 RESULTS_DICTS = 4 RESULTS_AGGREGATE_MODELS = 5 # To support "django-style" double-underscore filters, create a mapping between # operation name and operation code, e.g. "__eq" == OP.EQ. DJANGO_MAP = { 'eq': OP.EQ, 'lt': OP.LT, 'lte': OP.LTE, 'gt': OP.GT, 'gte': OP.GTE, 'ne': OP.NE, 'in': OP.IN, 'is': OP.IS, 'like': OP.LIKE, 'ilike': OP.ILIKE, 'regexp': OP.REGEXP, } # Helper functions that are used in various parts of the codebase. def merge_dict(source, overrides): merged = source.copy() merged.update(overrides) return merged def returns_clone(func): """ Method decorator that will "clone" the object before applying the given method. This ensures that state is mutated in a more predictable fashion, and promotes the use of method-chaining. """ def inner(self, *args, **kwargs): clone = self.clone() # Assumes object implements `clone`. func(clone, *args, **kwargs) return clone inner.call_local = func # Provide a way to call without cloning. return inner def not_allowed(func): """ Method decorator to indicate a method is not allowed to be called. Will raise a `NotImplementedError`. """ def inner(self, *args, **kwargs): raise NotImplementedError('%s is not allowed on %s instances' % ( func, type(self).__name__)) return inner class Proxy(object): """ Proxy class useful for situations when you wish to defer the initialization of an object. 
""" __slots__ = ['obj', '_callbacks'] def __init__(self): self._callbacks = [] self.initialize(None) def initialize(self, obj): self.obj = obj for callback in self._callbacks: callback(obj) def attach_callback(self, callback): self._callbacks.append(callback) return callback def __getattr__(self, attr): if self.obj is None: raise AttributeError('Cannot use uninitialized Proxy.') return getattr(self.obj, attr) def __setattr__(self, attr, value): if attr not in self.__slots__: raise AttributeError('Cannot set attribute on proxy.') return super(Proxy, self).__setattr__(attr, value) class DeferredRelation(object): _unresolved = set() def __init__(self, rel_model_name=None): if rel_model_name is not None: self._rel_model_name = rel_model_name.lower() self._unresolved.add(self) def set_field(self, model_class, field, name): self.model_class = model_class self.field = field self.name = name def set_model(self, rel_model): self.field.rel_model = rel_model self.field.add_to_class(self.model_class, self.name) @staticmethod def resolve(model_cls): unresolved = list(DeferredRelation._unresolved) for dr in unresolved: if dr._rel_model_name == model_cls.__name__.lower(): dr.set_model(model_cls) DeferredRelation._unresolved.discard(dr) class _CDescriptor(object): def __get__(self, instance, instance_type=None): if instance is not None: return Entity(instance._alias) return self # Classes representing the query tree. class Node(object): """Base-class for any part of a query which shall be composable.""" c = _CDescriptor() _node_type = 'node' def __init__(self): self._negated = False self._alias = None self._bind_to = None self._ordering = None # ASC or DESC. @classmethod def extend(cls, name=None, clone=False): def decorator(method): method_name = name or method.__name__ if clone: method = returns_clone(method) setattr(cls, method_name, method) return method return decorator def clone_base(self): return type(self)() def clone(self): inst = self.clone_base() inst._negated = self._negated inst._alias = self._alias inst._ordering = self._ordering inst._bind_to = self._bind_to return inst @returns_clone def __invert__(self): self._negated = not self._negated @returns_clone def alias(self, a=None): self._alias = a @returns_clone def bind_to(self, bt): """ Bind the results of an expression to a specific model type. Useful when adding expressions to a select, where the result of the expression should be placed on a joined instance. """ self._bind_to = bt @returns_clone def asc(self): self._ordering = 'ASC' @returns_clone def desc(self): self._ordering = 'DESC' def __pos__(self): return self.asc() def __neg__(self): return self.desc() def _e(op, inv=False): """ Lightweight factory which returns a method that builds an Expression consisting of the left-hand and right-hand operands, using `op`. 
""" def inner(self, rhs): if inv: return Expression(rhs, op, self) return Expression(self, op, rhs) return inner __and__ = _e(OP.AND) __or__ = _e(OP.OR) __add__ = _e(OP.ADD) __sub__ = _e(OP.SUB) __mul__ = _e(OP.MUL) __div__ = __truediv__ = _e(OP.DIV) __xor__ = _e(OP.XOR) __radd__ = _e(OP.ADD, inv=True) __rsub__ = _e(OP.SUB, inv=True) __rmul__ = _e(OP.MUL, inv=True) __rdiv__ = __rtruediv__ = _e(OP.DIV, inv=True) __rand__ = _e(OP.AND, inv=True) __ror__ = _e(OP.OR, inv=True) __rxor__ = _e(OP.XOR, inv=True) def __eq__(self, rhs): if rhs is None: return Expression(self, OP.IS, None) return Expression(self, OP.EQ, rhs) def __ne__(self, rhs): if rhs is None: return Expression(self, OP.IS_NOT, None) return Expression(self, OP.NE, rhs) __lt__ = _e(OP.LT) __le__ = _e(OP.LTE) __gt__ = _e(OP.GT) __ge__ = _e(OP.GTE) __lshift__ = _e(OP.IN) __rshift__ = _e(OP.IS) __mod__ = _e(OP.LIKE) __pow__ = _e(OP.ILIKE) bin_and = _e(OP.BIN_AND) bin_or = _e(OP.BIN_OR) # Special expressions. def in_(self, rhs): return Expression(self, OP.IN, rhs) def not_in(self, rhs): return Expression(self, OP.NOT_IN, rhs) def is_null(self, is_null=True): if is_null: return Expression(self, OP.IS, None) return Expression(self, OP.IS_NOT, None) def contains(self, rhs): return Expression(self, OP.ILIKE, '%%%s%%' % rhs) def startswith(self, rhs): return Expression(self, OP.ILIKE, '%s%%' % rhs) def endswith(self, rhs): return Expression(self, OP.ILIKE, '%%%s' % rhs) def between(self, low, high): return Expression(self, OP.BETWEEN, Clause(low, R('AND'), high)) def regexp(self, expression): return Expression(self, OP.REGEXP, expression) def concat(self, rhs): return Expression(self, OP.CONCAT, rhs) class SQL(Node): """An unescaped SQL string, with optional parameters.""" _node_type = 'sql' def __init__(self, value, *params): self.value = value self.params = params super(SQL, self).__init__() def clone_base(self): return SQL(self.value, *self.params) R = SQL # backwards-compat. class Entity(Node): """A quoted-name or entity, e.g. "table"."column".""" _node_type = 'entity' def __init__(self, *path): super(Entity, self).__init__() self.path = path def clone_base(self): return Entity(*self.path) def __getattr__(self, attr): return Entity(*filter(None, self.path + (attr,))) class Func(Node): """An arbitrary SQL function call.""" _node_type = 'func' _no_coerce = set(('count', 'sum')) def __init__(self, name, *arguments): self.name = name self.arguments = arguments self._coerce = (name.lower() not in self._no_coerce) if name else False super(Func, self).__init__() @returns_clone def coerce(self, coerce=True): self._coerce = coerce def clone_base(self): res = Func(self.name, *self.arguments) res._coerce = self._coerce return res def over(self, partition_by=None, order_by=None, window=None): if isinstance(partition_by, Window) and window is None: window = partition_by if window is None: sql = Window( partition_by=partition_by, order_by=order_by).__sql__() else: sql = SQL(window._alias) return Clause(self, SQL('OVER'), sql) def __getattr__(self, attr): def dec(*args, **kwargs): return Func(attr, *args, **kwargs) return dec # fn is a factory for creating `Func` objects and supports a more friendly # API. So instead of `Func("LOWER", param)`, `fn.LOWER(param)`. 
fn = Func(None) class Expression(Node): """A binary expression, e.g `foo + 1` or `bar < 7`.""" _node_type = 'expression' def __init__(self, lhs, op, rhs, flat=False): super(Expression, self).__init__() self.lhs = lhs self.op = op self.rhs = rhs self.flat = flat def clone_base(self): return Expression(self.lhs, self.op, self.rhs, self.flat) class Param(Node): """ Arbitrary parameter passed into a query. Instructs the query compiler to specifically treat this value as a parameter, useful for `list` which is special-cased for `IN` lookups. """ _node_type = 'param' def __init__(self, value, adapt=None): self.value = value self.adapt = adapt super(Param, self).__init__() def clone_base(self): return Param(self.value, self.adapt) class Passthrough(Param): _node_type = 'passthrough' class Clause(Node): """A SQL clause, one or more Node objects joined by spaces.""" _node_type = 'clause' glue = ' ' parens = False def __init__(self, *nodes, **kwargs): if 'glue' in kwargs: self.glue = kwargs['glue'] if 'parens' in kwargs: self.parens = kwargs['parens'] super(Clause, self).__init__() self.nodes = list(nodes) def clone_base(self): clone = Clause(*self.nodes) clone.glue = self.glue clone.parens = self.parens return clone class CommaClause(Clause): """One or more Node objects joined by commas, no parens.""" glue = ', ' class EnclosedClause(CommaClause): """One or more Node objects joined by commas and enclosed in parens.""" parens = True class Window(Node): def __init__(self, partition_by=None, order_by=None): super(Window, self).__init__() self.partition_by = partition_by self.order_by = order_by self._alias = self._alias or 'w' def __sql__(self): over_clauses = [] if self.partition_by: over_clauses.append(Clause( SQL('PARTITION BY'), CommaClause(*self.partition_by))) if self.order_by: over_clauses.append(Clause( SQL('ORDER BY'), CommaClause(*self.order_by))) return EnclosedClause(Clause(*over_clauses)) def clone_base(self): return Window(self.partition_by, self.order_by) def Check(value): return SQL('CHECK (%s)' % value) class DQ(Node): """A "django-style" filter expression, e.g. {'foo__eq': 'x'}.""" def __init__(self, **query): super(DQ, self).__init__() self.query = query def clone_base(self): return DQ(**self.query) class _StripParens(Node): _node_type = 'strip_parens' def __init__(self, node): super(_StripParens, self).__init__() self.node = node JoinMetadata = namedtuple('JoinMetadata', ( 'src_model', # Source Model class. 'dest_model', # Dest Model class. 'src', # Source, may be Model, ModelAlias 'dest', # Dest, may be Model, ModelAlias, or SelectQuery. 'attr', # Attribute name joined instance(s) should be assigned to. 'primary_key', # Primary key being joined on. 'foreign_key', # Foreign key being joined from. 'is_backref', # Is this a backref, i.e. 1 -> N. 'alias', # Explicit alias given to join expression. 'is_self_join', # Is this a self-join? 'is_expression', # Is the join ON clause an Expression? 
)) class Join(namedtuple('_Join', ('src', 'dest', 'join_type', 'on'))): def get_foreign_key(self, source, dest, field=None): if isinstance(source, SelectQuery) or isinstance(dest, SelectQuery): return None, None fk_field = source._meta.rel_for_model(dest, field) if fk_field is not None: return fk_field, False reverse_rel = source._meta.reverse_rel_for_model(dest, field) if reverse_rel is not None: return reverse_rel, True return None, None def get_join_type(self): return self.join_type or JOIN.INNER def model_from_alias(self, model_or_alias): if isinstance(model_or_alias, ModelAlias): return model_or_alias.model_class elif isinstance(model_or_alias, SelectQuery): return model_or_alias.model_class return model_or_alias def _join_metadata(self): # Get the actual tables being joined. src = self.model_from_alias(self.src) dest = self.model_from_alias(self.dest) join_alias = isinstance(self.on, Node) and self.on._alias or None is_expression = isinstance(self.on, (Expression, Func, SQL)) on_field = isinstance(self.on, (Field, FieldProxy)) and self.on or None if on_field: fk_field = on_field is_backref = on_field.name not in src._meta.fields else: fk_field, is_backref = self.get_foreign_key(src, dest, self.on) if fk_field is None and self.on is not None: fk_field, is_backref = self.get_foreign_key(src, dest) if fk_field is not None: primary_key = fk_field.to_field else: primary_key = None if not join_alias: if fk_field is not None: if is_backref: target_attr = dest._meta.db_table else: target_attr = fk_field.name else: try: target_attr = self.on.lhs.name except AttributeError: target_attr = dest._meta.db_table else: target_attr = None return JoinMetadata( src_model=src, dest_model=dest, src=self.src, dest=self.dest, attr=join_alias or target_attr, primary_key=primary_key, foreign_key=fk_field, is_backref=is_backref, alias=join_alias, is_self_join=src is dest, is_expression=is_expression) @property def metadata(self): if not hasattr(self, '_cached_metadata'): self._cached_metadata = self._join_metadata() return self._cached_metadata class FieldDescriptor(object): # Fields are exposed as descriptors in order to control access to the # underlying "raw" data. def __init__(self, field): self.field = field self.att_name = self.field.name def __get__(self, instance, instance_type=None): if instance is not None: return instance._data.get(self.att_name) return self.field def __set__(self, instance, value): instance._data[self.att_name] = value instance._dirty.add(self.att_name) class Field(Node): """A column on a table.""" _field_counter = 0 _order = 0 _node_type = 'field' db_field = 'unknown' def __init__(self, null=False, index=False, unique=False, verbose_name=None, help_text=None, db_column=None, default=None, choices=None, primary_key=False, sequence=None, constraints=None, schema=None): self.null = null self.index = index self.unique = unique self.verbose_name = verbose_name self.help_text = help_text self.db_column = db_column self.default = default self.choices = choices # Used for metadata purposes, not enforced. self.primary_key = primary_key self.sequence = sequence # Name of sequence, e.g. foo_id_seq. self.constraints = constraints # List of column constraints. self.schema = schema # Name of schema, e.g. 'public'. # Used internally for recovering the order in which Fields were defined # on the Model class. Field._field_counter += 1 self._order = Field._field_counter self._sort_key = (self.primary_key and 1 or 2), self._order self._is_bound = False # Whether the Field is "bound" to a Model. 
super(Field, self).__init__() def clone_base(self, **kwargs): inst = type(self)( null=self.null, index=self.index, unique=self.unique, verbose_name=self.verbose_name, help_text=self.help_text, db_column=self.db_column, default=self.default, choices=self.choices, primary_key=self.primary_key, sequence=self.sequence, constraints=self.constraints, schema=self.schema, **kwargs) if self._is_bound: inst.name = self.name inst.model_class = self.model_class inst._is_bound = self._is_bound return inst def add_to_class(self, model_class, name): """ Hook that replaces the `Field` attribute on a class with a named `FieldDescriptor`. Called by the metaclass during construction of the `Model`. """ self.name = name self.model_class = model_class self.db_column = self.db_column or self.name if not self.verbose_name: self.verbose_name = re.sub('_+', ' ', name).title() model_class._meta.add_field(self) setattr(model_class, name, FieldDescriptor(self)) self._is_bound = True def get_database(self): return self.model_class._meta.database def get_column_type(self): field_type = self.get_db_field() return self.get_database().compiler().get_column_type(field_type) def get_db_field(self): return self.db_field def get_modifiers(self): return None def coerce(self, value): return value def db_value(self, value): """Convert the python value for storage in the database.""" return value if value is None else self.coerce(value) def python_value(self, value): """Convert the database value to a pythonic value.""" return value if value is None else self.coerce(value) def as_entity(self, with_table=False): if with_table: return Entity(self.model_class._meta.db_table, self.db_column) return Entity(self.db_column) def __ddl_column__(self, column_type): """Return the column type, e.g. VARCHAR(255) or REAL.""" modifiers = self.get_modifiers() if modifiers: return SQL( '%s(%s)' % (column_type, ', '.join(map(str, modifiers)))) return SQL(column_type) def __ddl__(self, column_type): """Return a list of Node instances that defines the column.""" ddl = [self.as_entity(), self.__ddl_column__(column_type)] if not self.null: ddl.append(SQL('NOT NULL')) if self.primary_key: ddl.append(SQL('PRIMARY KEY')) if self.sequence: ddl.append(SQL("DEFAULT NEXTVAL('%s')" % self.sequence)) if self.constraints: ddl.extend(self.constraints) return ddl def __hash__(self): return hash(self.name + '.' + self.model_class.__name__) class BareField(Field): db_field = 'bare' def __init__(self, coerce=None, *args, **kwargs): super(BareField, self).__init__(*args, **kwargs) if coerce is not None: self.coerce = coerce def clone_base(self, **kwargs): return super(BareField, self).clone_base(coerce=self.coerce, **kwargs) class IntegerField(Field): db_field = 'int' coerce = int class BigIntegerField(IntegerField): db_field = 'bigint' class SmallIntegerField(IntegerField): db_field = 'smallint' class PrimaryKeyField(IntegerField): db_field = 'primary_key' def __init__(self, *args, **kwargs): kwargs['primary_key'] = True super(PrimaryKeyField, self).__init__(*args, **kwargs) class _AutoPrimaryKeyField(PrimaryKeyField): _column_name = None def add_to_class(self, model_class, name): if name != self._column_name: raise ValueError('%s must be named `%s`.' 
% (type(self), name)) super(_AutoPrimaryKeyField, self).add_to_class(model_class, name) class FloatField(Field): db_field = 'float' coerce = float class DoubleField(FloatField): db_field = 'double' class DecimalField(Field): db_field = 'decimal' def __init__(self, max_digits=10, decimal_places=5, auto_round=False, rounding=None, *args, **kwargs): self.max_digits = max_digits self.decimal_places = decimal_places self.auto_round = auto_round self.rounding = rounding or decimal.DefaultContext.rounding super(DecimalField, self).__init__(*args, **kwargs) def clone_base(self, **kwargs): return super(DecimalField, self).clone_base( max_digits=self.max_digits, decimal_places=self.decimal_places, auto_round=self.auto_round, rounding=self.rounding, **kwargs) def get_modifiers(self): return [self.max_digits, self.decimal_places] def db_value(self, value): D = decimal.Decimal if not value: return value if value is None else D(0) if self.auto_round: exp = D(10) ** (-self.decimal_places) rounding = self.rounding return D(str(value)).quantize(exp, rounding=rounding) return value def python_value(self, value): if value is not None: if isinstance(value, decimal.Decimal): return value return decimal.Decimal(str(value)) def coerce_to_unicode(s, encoding='utf-8'): if isinstance(s, unicode_type): return s elif isinstance(s, string_type): return s.decode(encoding) return unicode_type(s) class CharField(Field): db_field = 'string' def __init__(self, max_length=255, *args, **kwargs): self.max_length = max_length super(CharField, self).__init__(*args, **kwargs) def clone_base(self, **kwargs): return super(CharField, self).clone_base( max_length=self.max_length, **kwargs) def get_modifiers(self): return self.max_length and [self.max_length] or None def coerce(self, value): return coerce_to_unicode(value or '') class FixedCharField(CharField): db_field = 'fixed_char' def python_value(self, value): value = super(FixedCharField, self).python_value(value) if value: value = value.strip() return value class TextField(Field): db_field = 'text' def coerce(self, value): return coerce_to_unicode(value or '') class BlobField(Field): db_field = 'blob' _constructor = binary_construct def add_to_class(self, model_class, name): if isinstance(model_class._meta.database, Proxy): model_class._meta.database.attach_callback(self._set_constructor) return super(BlobField, self).add_to_class(model_class, name) def _set_constructor(self, database): self._constructor = database.get_binary_type() def db_value(self, value): if isinstance(value, unicode_type): value = value.encode('raw_unicode_escape') if isinstance(value, basestring): return self._constructor(value) return value class UUIDField(Field): db_field = 'uuid' def db_value(self, value): if isinstance(value, uuid.UUID): return value.hex try: return uuid.UUID(value).hex except: return value def python_value(self, value): if isinstance(value, uuid.UUID): return value return None if value is None else uuid.UUID(value) def _date_part(date_part): def dec(self): return self.model_class._meta.database.extract_date(date_part, self) return dec class _BaseFormattedField(Field): formats = None def __init__(self, formats=None, *args, **kwargs): if formats is not None: self.formats = formats super(_BaseFormattedField, self).__init__(*args, **kwargs) def clone_base(self, **kwargs): return super(_BaseFormattedField, self).clone_base( formats=self.formats, **kwargs) class DateTimeField(_BaseFormattedField): db_field = 'datetime' formats = [ '%Y-%m-%d %H:%M:%S.%f', '%Y-%m-%d %H:%M:%S', '%Y-%m-%d', ] 
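    # The class-level `formats` above are only defaults; they can be swapped
    # out per field instance (illustrative, `Event` is a hypothetical model):
    #
    #     class Event(Model):
    #         created = DateTimeField(formats=['%Y-%m-%dT%H:%M:%S'])
    #
    # String values coming back from the database are then parsed with those
    # formats by python_value() below.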
def python_value(self, value): if value and isinstance(value, basestring): return format_date_time(value, self.formats) return value year = property(_date_part('year')) month = property(_date_part('month')) day = property(_date_part('day')) hour = property(_date_part('hour')) minute = property(_date_part('minute')) second = property(_date_part('second')) class DateField(_BaseFormattedField): db_field = 'date' formats = [ '%Y-%m-%d', '%Y-%m-%d %H:%M:%S', '%Y-%m-%d %H:%M:%S.%f', ] def python_value(self, value): if value and isinstance(value, basestring): pp = lambda x: x.date() return format_date_time(value, self.formats, pp) elif value and isinstance(value, datetime.datetime): return value.date() return value year = property(_date_part('year')) month = property(_date_part('month')) day = property(_date_part('day')) class TimeField(_BaseFormattedField): db_field = 'time' formats = [ '%H:%M:%S.%f', '%H:%M:%S', '%H:%M', '%Y-%m-%d %H:%M:%S.%f', '%Y-%m-%d %H:%M:%S', ] def python_value(self, value): if value: if isinstance(value, basestring): pp = lambda x: x.time() return format_date_time(value, self.formats, pp) elif isinstance(value, datetime.datetime): return value.time() if value is not None and isinstance(value, datetime.timedelta): return (datetime.datetime.min + value).time() return value hour = property(_date_part('hour')) minute = property(_date_part('minute')) second = property(_date_part('second')) class TimestampField(IntegerField): # Support second -> microsecond resolution. valid_resolutions = [10**i for i in range(7)] def __init__(self, *args, **kwargs): self.resolution = kwargs.pop('resolution', 1) or 1 if self.resolution not in self.valid_resolutions: raise ValueError('TimestampField resolution must be one of: %s' % ', '.join(str(i) for i in self.valid_resolutions)) self.utc = kwargs.pop('utc', False) or False _dt = datetime.datetime self._conv = _dt.utcfromtimestamp if self.utc else _dt.fromtimestamp _default = _dt.utcnow if self.utc else _dt.now kwargs.setdefault('default', _default) super(TimestampField, self).__init__(*args, **kwargs) def get_db_field(self): # For second resolution we can get away (for a while) with using # 4 bytes to store the timestamp (as long as they're not > ~2038). # Otherwise we'll need to use a BigInteger type. 
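        # Illustrative: TimestampField() keeps whole seconds and fits in a
        # plain INTEGER column, while e.g. TimestampField(resolution=1000)
        # stores millisecond-scaled values and therefore needs the BIGINT
        # column type returned below.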
return (self.db_field if self.resolution == 1 else BigIntegerField.db_field) def db_value(self, value): if value is None: return if isinstance(value, datetime.datetime): pass elif isinstance(value, datetime.date): value = datetime.datetime(value.year, value.month, value.day) else: return int(round(value * self.resolution)) if self.utc: timestamp = calendar.timegm(value.utctimetuple()) else: timestamp = time.mktime(value.timetuple()) timestamp += (value.microsecond * .000001) if self.resolution > 1: timestamp *= self.resolution return int(round(timestamp)) def python_value(self, value): if value is not None and isinstance(value, (int, float, long)): if value == 0: return elif self.resolution > 1: value, microseconds = divmod(value, self.resolution) return self._conv(value).replace(microsecond=microseconds) else: return self._conv(value) return value class BooleanField(Field): db_field = 'bool' coerce = bool class RelationDescriptor(FieldDescriptor): """Foreign-key abstraction to replace a related PK with a related model.""" def __init__(self, field, rel_model): self.rel_model = rel_model super(RelationDescriptor, self).__init__(field) def get_object_or_id(self, instance): rel_id = instance._data.get(self.att_name) if rel_id is not None or self.att_name in instance._obj_cache: if self.att_name not in instance._obj_cache: obj = self.rel_model.get(self.field.to_field == rel_id) instance._obj_cache[self.att_name] = obj return instance._obj_cache[self.att_name] elif not self.field.null: raise self.rel_model.DoesNotExist return rel_id def __get__(self, instance, instance_type=None): if instance is not None: return self.get_object_or_id(instance) return self.field def __set__(self, instance, value): if isinstance(value, self.rel_model): instance._data[self.att_name] = getattr( value, self.field.to_field.name) instance._obj_cache[self.att_name] = value else: orig_value = instance._data.get(self.att_name) instance._data[self.att_name] = value if orig_value != value and self.att_name in instance._obj_cache: del instance._obj_cache[self.att_name] instance._dirty.add(self.att_name) class ReverseRelationDescriptor(object): """Back-reference to expose related objects as a `SelectQuery`.""" def __init__(self, field): self.field = field self.rel_model = field.model_class def __get__(self, instance, instance_type=None): if instance is not None: return self.rel_model.select().where( self.field == getattr(instance, self.field.to_field.name)) return self class ObjectIdDescriptor(object): """Gives direct access to the underlying id""" def __init__(self, field): self.attr_name = field.name self.field = weakref.ref(field) def __get__(self, instance, instance_type=None): if instance is not None: return instance._data.get(self.attr_name) return self.field() def __set__(self, instance, value): setattr(instance, self.attr_name, value) class ForeignKeyField(IntegerField): def __init__(self, rel_model, related_name=None, on_delete=None, on_update=None, extra=None, to_field=None, *args, **kwargs): if rel_model != 'self' and not \ isinstance(rel_model, (Proxy, DeferredRelation)) and not \ issubclass(rel_model, Model): raise TypeError('Unexpected value for `rel_model`. 
Expected ' '`Model`, `Proxy`, `DeferredRelation`, or "self"') self.rel_model = rel_model self._related_name = related_name self.deferred = isinstance(rel_model, (Proxy, DeferredRelation)) self.on_delete = on_delete self.on_update = on_update self.extra = extra self.to_field = to_field super(ForeignKeyField, self).__init__(*args, **kwargs) def clone_base(self, **kwargs): return super(ForeignKeyField, self).clone_base( rel_model=self.rel_model, related_name=self._get_related_name(), on_delete=self.on_delete, on_update=self.on_update, extra=self.extra, to_field=self.to_field, **kwargs) def _get_descriptor(self): return RelationDescriptor(self, self.rel_model) def _get_id_descriptor(self): return ObjectIdDescriptor(self) def _get_backref_descriptor(self): return ReverseRelationDescriptor(self) def _get_related_name(self): if self._related_name and callable(self._related_name): return self._related_name(self) return self._related_name or ('%s_set' % self.model_class._meta.name) def add_to_class(self, model_class, name): if isinstance(self.rel_model, Proxy): def callback(rel_model): self.rel_model = rel_model self.add_to_class(model_class, name) self.rel_model.attach_callback(callback) return elif isinstance(self.rel_model, DeferredRelation): self.rel_model.set_field(model_class, self, name) return self.name = name self.model_class = model_class self.db_column = obj_id_name = self.db_column or '%s_id' % self.name if obj_id_name == self.name: obj_id_name += '_id' if not self.verbose_name: self.verbose_name = re.sub('_+', ' ', name).title() model_class._meta.add_field(self) self.related_name = self._get_related_name() if self.rel_model == 'self': self.rel_model = self.model_class if self.to_field is not None: if not isinstance(self.to_field, Field): self.to_field = getattr(self.rel_model, self.to_field) else: self.to_field = self.rel_model._meta.primary_key # TODO: factor into separate method. if model_class._meta.validate_backrefs: def invalid(msg, **context): context.update( field='%s.%s' % (model_class._meta.name, name), backref=self.related_name, obj_id_name=obj_id_name) raise AttributeError(msg % context) if self.related_name in self.rel_model._meta.fields: invalid('The related_name of %(field)s ("%(backref)s") ' 'conflicts with a field of the same name.') elif self.related_name in self.rel_model._meta.reverse_rel: invalid('The related_name of %(field)s ("%(backref)s") ' 'is already in use by another foreign key.') if obj_id_name in model_class._meta.fields: invalid('The object id descriptor of %(field)s conflicts ' 'with a field named %(obj_id_name)s') elif obj_id_name in model_class.__dict__: invalid('Model attribute "%(obj_id_name)s" would be shadowed ' 'by the object id descriptor of %(field)s.') setattr(model_class, name, self._get_descriptor()) setattr(model_class, obj_id_name, self._get_id_descriptor()) setattr(self.rel_model, self.related_name, self._get_backref_descriptor()) self._is_bound = True model_class._meta.rel[self.name] = self self.rel_model._meta.reverse_rel[self.related_name] = self def get_db_field(self): """ Overridden to ensure Foreign Keys use same column type as the primary key they point to. 
""" if not isinstance(self.to_field, PrimaryKeyField): return self.to_field.get_db_field() return super(ForeignKeyField, self).get_db_field() def get_modifiers(self): if not isinstance(self.to_field, PrimaryKeyField): return self.to_field.get_modifiers() return super(ForeignKeyField, self).get_modifiers() def coerce(self, value): return self.to_field.coerce(value) def db_value(self, value): if isinstance(value, self.rel_model): value = value._get_pk_value() return self.to_field.db_value(value) def python_value(self, value): if isinstance(value, self.rel_model): return value return self.to_field.python_value(value) class CompositeKey(object): """A primary key composed of multiple columns.""" sequence = None def __init__(self, *field_names): self.field_names = field_names def add_to_class(self, model_class, name): self.name = name self.model_class = model_class setattr(model_class, name, self) def __get__(self, instance, instance_type=None): if instance is not None: return tuple([getattr(instance, field_name) for field_name in self.field_names]) return self def __set__(self, instance, value): pass def __eq__(self, other): expressions = [(self.model_class._meta.fields[field] == value) for field, value in zip(self.field_names, other)] return reduce(operator.and_, expressions) def __hash__(self): return hash((self.model_class.__name__, self.field_names)) class AliasMap(object): prefix = 't' def __init__(self, start=0): self._alias_map = {} self._counter = start def __repr__(self): return '' % self._alias_map def add(self, obj, alias=None): if obj in self._alias_map: return self._counter += 1 self._alias_map[obj] = alias or '%s%s' % (self.prefix, self._counter) def __getitem__(self, obj): if obj not in self._alias_map: self.add(obj) return self._alias_map[obj] def __contains__(self, obj): return obj in self._alias_map def update(self, alias_map): if alias_map: for obj, alias in alias_map._alias_map.items(): if obj not in self: self._alias_map[obj] = alias return self class QueryCompiler(object): # Mapping of `db_type` to actual column type used by database driver. # Database classes may provide additional column types or overrides. field_map = { 'bare': '', 'bigint': 'BIGINT', 'blob': 'BLOB', 'bool': 'SMALLINT', 'date': 'DATE', 'datetime': 'DATETIME', 'decimal': 'DECIMAL', 'double': 'REAL', 'fixed_char': 'CHAR', 'float': 'REAL', 'int': 'INTEGER', 'primary_key': 'INTEGER', 'smallint': 'SMALLINT', 'string': 'VARCHAR', 'text': 'TEXT', 'time': 'TIME', } # Mapping of OP. to actual SQL operation. For most databases this will be # the same, but some column types or databases may support additional ops. # Like `field_map`, Database classes may extend or override these. 
op_map = { OP.EQ: '=', OP.LT: '<', OP.LTE: '<=', OP.GT: '>', OP.GTE: '>=', OP.NE: '!=', OP.IN: 'IN', OP.NOT_IN: 'NOT IN', OP.IS: 'IS', OP.IS_NOT: 'IS NOT', OP.BIN_AND: '&', OP.BIN_OR: '|', OP.LIKE: 'LIKE', OP.ILIKE: 'ILIKE', OP.BETWEEN: 'BETWEEN', OP.ADD: '+', OP.SUB: '-', OP.MUL: '*', OP.DIV: '/', OP.XOR: '#', OP.AND: 'AND', OP.OR: 'OR', OP.MOD: '%', OP.REGEXP: 'REGEXP', OP.CONCAT: '||', } join_map = { JOIN.INNER: 'INNER JOIN', JOIN.LEFT_OUTER: 'LEFT OUTER JOIN', JOIN.RIGHT_OUTER: 'RIGHT OUTER JOIN', JOIN.FULL: 'FULL JOIN', } alias_map_class = AliasMap def __init__(self, quote_char='"', interpolation='?', field_overrides=None, op_overrides=None): self.quote_char = quote_char self.interpolation = interpolation self._field_map = merge_dict(self.field_map, field_overrides or {}) self._op_map = merge_dict(self.op_map, op_overrides or {}) self._parse_map = self.get_parse_map() self._unknown_types = set(['param']) def get_parse_map(self): # To avoid O(n) lookups when parsing nodes, use a lookup table for # common node types O(1). return { 'expression': self._parse_expression, 'param': self._parse_param, 'passthrough': self._parse_passthrough, 'func': self._parse_func, 'clause': self._parse_clause, 'entity': self._parse_entity, 'field': self._parse_field, 'sql': self._parse_sql, 'select_query': self._parse_select_query, 'compound_select_query': self._parse_compound_select_query, 'strip_parens': self._parse_strip_parens, } def quote(self, s): return '%s%s%s' % (self.quote_char, s, self.quote_char) def get_column_type(self, f): return self._field_map[f] if f in self._field_map else f.upper() def get_op(self, q): return self._op_map[q] def _sorted_fields(self, field_dict): return sorted(field_dict.items(), key=lambda i: i[0]._sort_key) def _parse_default(self, node, alias_map, conv): return self.interpolation, [node] def _parse_expression(self, node, alias_map, conv): if isinstance(node.lhs, Field): conv = node.lhs lhs, lparams = self.parse_node(node.lhs, alias_map, conv) rhs, rparams = self.parse_node(node.rhs, alias_map, conv) template = '%s %s %s' if node.flat else '(%s %s %s)' sql = template % (lhs, self.get_op(node.op), rhs) return sql, lparams + rparams def _parse_passthrough(self, node, alias_map, conv): if node.adapt: return self.parse_node(node.adapt(node.value), alias_map, None) return self.interpolation, [node.value] def _parse_param(self, node, alias_map, conv): if node.adapt: if conv and conv.db_value is node.adapt: conv = None return self.parse_node(node.adapt(node.value), alias_map, conv) elif conv is not None: return self.parse_node(conv.db_value(node.value), alias_map) else: return self.interpolation, [node.value] def _parse_func(self, node, alias_map, conv): conv = node._coerce and conv or None sql, params = self.parse_node_list(node.arguments, alias_map, conv) return '%s(%s)' % (node.name, strip_parens(sql)), params def _parse_clause(self, node, alias_map, conv): sql, params = self.parse_node_list( node.nodes, alias_map, conv, node.glue) if node.parens: sql = '(%s)' % strip_parens(sql) return sql, params def _parse_entity(self, node, alias_map, conv): return '.'.join(map(self.quote, node.path)), [] def _parse_sql(self, node, alias_map, conv): return node.value, list(node.params) def _parse_field(self, node, alias_map, conv): if alias_map: sql = '.'.join(( self.quote(alias_map[node.model_class]), self.quote(node.db_column))) else: sql = self.quote(node.db_column) return sql, [] def _parse_compound_select_query(self, node, alias_map, conv): csq = 'compound_select_query' if 
node.rhs._node_type == csq and node.lhs._node_type != csq: first_q, second_q = node.rhs, node.lhs inv = True else: first_q, second_q = node.lhs, node.rhs inv = False new_map = self.alias_map_class() if first_q._node_type == csq: new_map._counter = alias_map._counter first, first_p = self.generate_select(first_q, new_map) second, second_p = self.generate_select( second_q, self.calculate_alias_map(second_q, new_map)) if inv: l, lp, r, rp = second, second_p, first, first_p else: l, lp, r, rp = first, first_p , second, second_p # We add outer parentheses in the event the compound query is used in # the `from_()` clause, in which case we'll need them. if node.database.compound_select_parentheses: sql = '((%s) %s (%s))' % (l, node.operator, r) else: sql = '(%s %s %s)' % (l, node.operator, r) return sql, lp + rp def _parse_select_query(self, node, alias_map, conv): clone = node.clone() if not node._explicit_selection: if conv and isinstance(conv, ForeignKeyField): select_field = conv.to_field else: select_field = clone.model_class._meta.primary_key clone._select = (select_field,) sub, params = self.generate_select(clone, alias_map) return '(%s)' % strip_parens(sub), params def _parse_strip_parens(self, node, alias_map, conv): sql, params = self.parse_node(node.node, alias_map, conv) return strip_parens(sql), params def _parse(self, node, alias_map, conv): # By default treat the incoming node as a raw value that should be # parameterized. node_type = getattr(node, '_node_type', None) unknown = False if node_type in self._parse_map: sql, params = self._parse_map[node_type](node, alias_map, conv) unknown = (node_type in self._unknown_types and node.adapt is None and conv is None) elif isinstance(node, (list, tuple, set)): # If you're wondering how to pass a list into your query, simply # wrap it in Param(). 
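            # Illustrative (`User` is a hypothetical model): a bare list is
            # expanded into a parenthesized, comma-separated set of
            # placeholders, which is what the IN operator (<<) relies on:
            #
            #     User.select().where(User.id << [1, 2, 3])
            #     # ... WHERE ("id" IN (?, ?, ?))  -- roughly
            #
            # whereas Param([1, 2, 3]) binds the entire list as one parameter.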
sql, params = self.parse_node_list(node, alias_map, conv) sql = '(%s)' % sql elif isinstance(node, Model): sql = self.interpolation if conv and isinstance(conv, ForeignKeyField): to_field = conv.to_field if isinstance(to_field, ForeignKeyField): value = conv.db_value(node) else: value = to_field.db_value(getattr(node, to_field.name)) else: value = node._get_pk_value() params = [value] elif (isclass(node) and issubclass(node, Model)) or \ isinstance(node, ModelAlias): entity = node.as_entity().alias(alias_map[node]) sql, params = self.parse_node(entity, alias_map, conv) elif conv is not None: value = conv.db_value(node) sql, params, _ = self._parse(value, alias_map, None) else: sql, params = self._parse_default(node, alias_map, None) unknown = True return sql, params, unknown def parse_node(self, node, alias_map=None, conv=None): sql, params, unknown = self._parse(node, alias_map, conv) if unknown and (conv is not None) and params: params = [conv.db_value(i) for i in params] if isinstance(node, Node): if node._negated: sql = 'NOT %s' % sql if node._alias: sql = ' '.join((sql, 'AS', node._alias)) if node._ordering: sql = ' '.join((sql, node._ordering)) if params and any(isinstance(p, Node) for p in params): clean_params = [] clean_sql = [] for idx, param in enumerate(params): if isinstance(param, Node): csql, cparams = self.parse_node(param) return sql, params def parse_node_list(self, nodes, alias_map, conv=None, glue=', '): sql = [] params = [] for node in nodes: node_sql, node_params = self.parse_node(node, alias_map, conv) sql.append(node_sql) params.extend(node_params) return glue.join(sql), params def calculate_alias_map(self, query, alias_map=None): new_map = self.alias_map_class() if alias_map is not None: new_map._counter = alias_map._counter new_map.add(query.model_class, query.model_class._meta.table_alias) for src_model, joined_models in query._joins.items(): new_map.add(src_model, src_model._meta.table_alias) for join_obj in joined_models: if isinstance(join_obj.dest, Node): new_map.add(join_obj.dest, join_obj.dest.alias) else: new_map.add(join_obj.dest, join_obj.dest._meta.table_alias) return new_map.update(alias_map) def build_query(self, clauses, alias_map=None): return self.parse_node(Clause(*clauses), alias_map) def generate_joins(self, joins, model_class, alias_map): # Joins are implemented as an adjancency-list graph. Perform a # depth-first search of the graph to generate all the necessary JOINs. clauses = [] seen = set() q = [model_class] while q: curr = q.pop() if curr not in joins or curr in seen: continue seen.add(curr) for join in joins[curr]: src = curr dest = join.dest if isinstance(join.on, (Expression, Func, Clause, Entity)): # Clear any alias on the join expression. constraint = join.on.clone().alias() else: metadata = join.metadata if metadata.is_backref: fk_model = join.dest pk_model = join.src else: fk_model = join.src pk_model = join.dest fk = metadata.foreign_key if fk: lhs = getattr(fk_model, fk.name) rhs = getattr(pk_model, fk.to_field.name) if metadata.is_backref: lhs, rhs = rhs, lhs constraint = (lhs == rhs) else: raise ValueError('Missing required join predicate.') if isinstance(dest, Node): # TODO: ensure alias? 
dest_n = dest else: q.append(dest) dest_n = dest.as_entity().alias(alias_map[dest]) join_type = join.get_join_type() if join_type in self.join_map: join_sql = SQL(self.join_map[join_type]) else: join_sql = SQL(join_type) clauses.append( Clause(join_sql, dest_n, SQL('ON'), constraint)) return clauses def generate_select(self, query, alias_map=None): model = query.model_class db = model._meta.database alias_map = self.calculate_alias_map(query, alias_map) if isinstance(query, CompoundSelect): clauses = [_StripParens(query)] else: if not query._distinct: clauses = [SQL('SELECT')] else: clauses = [SQL('SELECT DISTINCT')] if query._distinct not in (True, False): clauses += [SQL('ON'), EnclosedClause(*query._distinct)] select_clause = Clause(*query._select) select_clause.glue = ', ' clauses.extend((select_clause, SQL('FROM'))) if query._from is None: clauses.append(model.as_entity().alias(alias_map[model])) else: clauses.append(CommaClause(*query._from)) if query._windows is not None: clauses.append(SQL('WINDOW')) clauses.append(CommaClause(*[ Clause( SQL(window._alias), SQL('AS'), window.__sql__()) for window in query._windows])) join_clauses = self.generate_joins(query._joins, model, alias_map) if join_clauses: clauses.extend(join_clauses) if query._where is not None: clauses.extend([SQL('WHERE'), query._where]) if query._group_by: clauses.extend([SQL('GROUP BY'), CommaClause(*query._group_by)]) if query._having: clauses.extend([SQL('HAVING'), query._having]) if query._order_by: clauses.extend([SQL('ORDER BY'), CommaClause(*query._order_by)]) if query._limit is not None or (query._offset and db.limit_max): limit = query._limit if query._limit is not None else db.limit_max clauses.append(SQL('LIMIT %s' % limit)) if query._offset is not None: clauses.append(SQL('OFFSET %s' % query._offset)) for_update, no_wait = query._for_update if for_update: stmt = 'FOR UPDATE NOWAIT' if no_wait else 'FOR UPDATE' clauses.append(SQL(stmt)) return self.build_query(clauses, alias_map) def generate_update(self, query): model = query.model_class alias_map = self.alias_map_class() alias_map.add(model, model._meta.db_table) if query._on_conflict: statement = 'UPDATE OR %s' % query._on_conflict else: statement = 'UPDATE' clauses = [SQL(statement), model.as_entity(), SQL('SET')] update = [] for field, value in self._sorted_fields(query._update): if not isinstance(value, (Node, Model)): value = Param(value, adapt=field.db_value) update.append(Expression( field.as_entity(with_table=False), OP.EQ, value, flat=True)) # No outer parens, no table alias. clauses.append(CommaClause(*update)) if query._where: clauses.extend([SQL('WHERE'), query._where]) if query._returning is not None: returning_clause = Clause(*query._returning) returning_clause.glue = ', ' clauses.extend([SQL('RETURNING'), returning_clause]) return self.build_query(clauses, alias_map) def _get_field_clause(self, fields, clause_type=EnclosedClause): return clause_type(*[ field.as_entity(with_table=False) for field in fields]) def generate_insert(self, query): model = query.model_class meta = model._meta alias_map = self.alias_map_class() alias_map.add(model, model._meta.db_table) if query._upsert: statement = meta.database.upsert_sql elif query._on_conflict: statement = 'INSERT OR %s INTO' % query._on_conflict else: statement = 'INSERT INTO' clauses = [SQL(statement), model.as_entity()] if query._query is not None: # This INSERT query is of the form INSERT INTO ... SELECT FROM. 
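            # Such queries are produced by Model.insert_from() (illustrative;
            # the model names are hypothetical):
            #
            #     ArchivedUser.insert_from(
            #         fields=[ArchivedUser.username],
            #         query=User.select(User.username)).execute()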
if query._fields: clauses.append(self._get_field_clause(query._fields)) clauses.append(_StripParens(query._query)) elif query._rows is not None: fields, value_clauses = [], [] have_fields = False for row_dict in query._iter_rows(): if not have_fields: fields = sorted( row_dict.keys(), key=operator.attrgetter('_sort_key')) have_fields = True values = [] for field in fields: value = row_dict[field] if not isinstance(value, (Node, Model)): value = Param(value, adapt=field.db_value) values.append(value) value_clauses.append(EnclosedClause(*values)) if fields: clauses.extend([ self._get_field_clause(fields), SQL('VALUES'), CommaClause(*value_clauses)]) elif query.model_class._meta.auto_increment: # Bare insert, use default value for primary key. clauses.append(query.database.default_insert_clause( query.model_class)) if query.is_insert_returning: clauses.extend([ SQL('RETURNING'), self._get_field_clause( meta.get_primary_key_fields(), clause_type=CommaClause)]) elif query._returning is not None: returning_clause = Clause(*query._returning) returning_clause.glue = ', ' clauses.extend([SQL('RETURNING'), returning_clause]) return self.build_query(clauses, alias_map) def generate_delete(self, query): model = query.model_class clauses = [SQL('DELETE FROM'), model.as_entity()] if query._where: clauses.extend([SQL('WHERE'), query._where]) if query._returning is not None: returning_clause = Clause(*query._returning) returning_clause.glue = ', ' clauses.extend([SQL('RETURNING'), returning_clause]) return self.build_query(clauses) def field_definition(self, field): column_type = self.get_column_type(field.get_db_field()) ddl = field.__ddl__(column_type) return Clause(*ddl) def foreign_key_constraint(self, field): ddl = [ SQL('FOREIGN KEY'), EnclosedClause(field.as_entity()), SQL('REFERENCES'), field.rel_model.as_entity(), EnclosedClause(field.to_field.as_entity())] if field.on_delete: ddl.append(SQL('ON DELETE %s' % field.on_delete)) if field.on_update: ddl.append(SQL('ON UPDATE %s' % field.on_update)) return Clause(*ddl) def return_parsed_node(function_name): # TODO: treat all `generate_` functions as returning clauses, instead # of SQL/params. 
def inner(self, *args, **kwargs): fn = getattr(self, function_name) return self.parse_node(fn(*args, **kwargs)) return inner def _create_foreign_key(self, model_class, field, constraint=None): constraint = constraint or 'fk_%s_%s_refs_%s' % ( model_class._meta.db_table, field.db_column, field.rel_model._meta.db_table) fk_clause = self.foreign_key_constraint(field) return Clause( SQL('ALTER TABLE'), model_class.as_entity(), SQL('ADD CONSTRAINT'), Entity(constraint), *fk_clause.nodes) create_foreign_key = return_parsed_node('_create_foreign_key') def _create_table(self, model_class, safe=False): statement = 'CREATE TABLE IF NOT EXISTS' if safe else 'CREATE TABLE' meta = model_class._meta columns, constraints = [], [] if meta.composite_key: pk_cols = [meta.fields[f].as_entity() for f in meta.primary_key.field_names] constraints.append(Clause( SQL('PRIMARY KEY'), EnclosedClause(*pk_cols))) for field in meta.declared_fields: columns.append(self.field_definition(field)) if isinstance(field, ForeignKeyField) and not field.deferred: constraints.append(self.foreign_key_constraint(field)) if model_class._meta.constraints: for constraint in model_class._meta.constraints: if not isinstance(constraint, Node): constraint = SQL(constraint) constraints.append(constraint) return Clause( SQL(statement), model_class.as_entity(), EnclosedClause(*(columns + constraints))) create_table = return_parsed_node('_create_table') def _drop_table(self, model_class, fail_silently=False, cascade=False): statement = 'DROP TABLE IF EXISTS' if fail_silently else 'DROP TABLE' ddl = [SQL(statement), model_class.as_entity()] if cascade: ddl.append(SQL('CASCADE')) return Clause(*ddl) drop_table = return_parsed_node('_drop_table') def _truncate_table(self, model_class, restart_identity=False, cascade=False): ddl = [SQL('TRUNCATE TABLE'), model_class.as_entity()] if restart_identity: ddl.append(SQL('RESTART IDENTITY')) if cascade: ddl.append(SQL('CASCADE')) return Clause(*ddl) truncate_table = return_parsed_node('_truncate_table') def index_name(self, table, columns): index = '%s_%s' % (table, '_'.join(columns)) if len(index) > 64: index_hash = hashlib.md5(index.encode('utf-8')).hexdigest() index = '%s_%s' % (table[:55], index_hash[:8]) # 55 + 1 + 8 = 64 return index def _create_index(self, model_class, fields, unique, *extra): tbl_name = model_class._meta.db_table statement = 'CREATE UNIQUE INDEX' if unique else 'CREATE INDEX' index_name = self.index_name(tbl_name, [f.db_column for f in fields]) return Clause( SQL(statement), Entity(index_name), SQL('ON'), model_class.as_entity(), EnclosedClause(*[field.as_entity() for field in fields]), *extra) create_index = return_parsed_node('_create_index') def _drop_index(self, model_class, fields, fail_silently=False): tbl_name = model_class._meta.db_table statement = 'DROP INDEX IF EXISTS' if fail_silently else 'DROP INDEX' index_name = self.index_name(tbl_name, [f.db_column for f in fields]) return Clause(SQL(statement), Entity(index_name)) drop_index = return_parsed_node('_drop_index') def _create_sequence(self, sequence_name): return Clause(SQL('CREATE SEQUENCE'), Entity(sequence_name)) create_sequence = return_parsed_node('_create_sequence') def _drop_sequence(self, sequence_name): return Clause(SQL('DROP SEQUENCE'), Entity(sequence_name)) drop_sequence = return_parsed_node('_drop_sequence') class SqliteQueryCompiler(QueryCompiler): def truncate_table(self, model_class, restart_identity=False, cascade=False): return model_class.delete().sql() class ResultIterator(object): def 
__init__(self, qrw): self.qrw = qrw self._idx = 0 def next(self): if self._idx < self.qrw._ct: obj = self.qrw._result_cache[self._idx] elif not self.qrw._populated: obj = self.qrw.iterate() self.qrw._result_cache.append(obj) self.qrw._ct += 1 else: raise StopIteration self._idx += 1 return obj __next__ = next class QueryResultWrapper(object): """ Provides an iterator over the results of a raw Query, additionally doing two things: - converts rows from the database into python representations - ensures that multiple iterations do not result in multiple queries """ def __init__(self, model, cursor, meta=None): self.model = model self.cursor = cursor self._ct = 0 self._idx = 0 self._result_cache = [] self._populated = False self._initialized = False if meta is not None: self.column_meta, self.join_meta = meta else: self.column_meta = self.join_meta = None def __iter__(self): if self._populated: return iter(self._result_cache) else: return ResultIterator(self) @property def count(self): self.fill_cache() return self._ct def __len__(self): return self.count def process_row(self, row): return row def iterate(self): row = self.cursor.fetchone() if not row: self._populated = True if not getattr(self.cursor, 'name', None): self.cursor.close() raise StopIteration elif not self._initialized: self.initialize(self.cursor.description) self._initialized = True return self.process_row(row) def iterator(self): while True: yield self.iterate() def next(self): if self._idx < self._ct: inst = self._result_cache[self._idx] self._idx += 1 return inst elif self._populated: raise StopIteration obj = self.iterate() self._result_cache.append(obj) self._ct += 1 self._idx += 1 return obj __next__ = next def fill_cache(self, n=None): n = n or float('Inf') if n < 0: raise ValueError('Negative values are not supported.') self._idx = self._ct while not self._populated and (n > self._ct): try: next(self) except StopIteration: break class ExtQueryResultWrapper(QueryResultWrapper): def initialize(self, description): model = self.model conv = [] identity = lambda x: x for i in range(len(description)): func = identity column = description[i][0] found = False if self.column_meta is not None: try: select_column = self.column_meta[i] except IndexError: pass else: if isinstance(select_column, Field): func = select_column.python_value column = select_column._alias or select_column.name found = True elif (isinstance(select_column, Func) and len(select_column.arguments) and isinstance(select_column.arguments[0], Field)): if select_column._coerce: # Special-case handling aggregations. 
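                            # e.g. fn.MAX(Tweet.created_date) (hypothetical
                            # model) is converted back through the wrapped
                            # field's python_value (a datetime here), whereas
                            # COUNT/SUM are listed in Func._no_coerce and keep
                            # their raw numeric value.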
func = select_column.arguments[0].python_value found = True if not found and column in model._meta.columns: field_obj = model._meta.columns[column] column = field_obj.name func = field_obj.python_value conv.append((i, column, func)) self.conv = conv class TuplesQueryResultWrapper(ExtQueryResultWrapper): def process_row(self, row): return tuple([self.conv[i][2](col) for i, col in enumerate(row)]) if _TuplesQueryResultWrapper is None: _TuplesQueryResultWrapper = TuplesQueryResultWrapper class NaiveQueryResultWrapper(ExtQueryResultWrapper): def process_row(self, row): instance = self.model() for i, column, func in self.conv: setattr(instance, column, func(row[i])) instance._prepare_instance() return instance if _ModelQueryResultWrapper is None: _ModelQueryResultWrapper = NaiveQueryResultWrapper class DictQueryResultWrapper(ExtQueryResultWrapper): def process_row(self, row): res = {} for i, column, func in self.conv: res[column] = func(row[i]) return res if _DictQueryResultWrapper is None: _DictQueryResultWrapper = DictQueryResultWrapper class ModelQueryResultWrapper(QueryResultWrapper): def initialize(self, description): self.column_map, model_set = self.generate_column_map() self._col_set = set(col for col in self.column_meta if isinstance(col, Field)) self.join_list = self.generate_join_list(model_set) def generate_column_map(self): column_map = [] models = set([self.model]) for i, node in enumerate(self.column_meta): attr = conv = None if isinstance(node, Field): if isinstance(node, FieldProxy): key = node._model_alias constructor = node.model conv = node.field_instance.python_value else: key = constructor = node.model_class conv = node.python_value attr = node._alias or node.name else: if node._bind_to is None: key = constructor = self.model else: key = constructor = node._bind_to if isinstance(node, Node) and node._alias: attr = node._alias elif isinstance(node, Entity): attr = node.path[-1] column_map.append((key, constructor, attr, conv)) models.add(key) return column_map, models def generate_join_list(self, models): join_list = [] joins = self.join_meta stack = [self.model] while stack: current = stack.pop() if current not in joins: continue for join in joins[current]: metadata = join.metadata if metadata.dest in models or metadata.dest_model in models: if metadata.foreign_key is not None: fk_present = metadata.foreign_key in self._col_set pk_present = metadata.primary_key in self._col_set check = metadata.foreign_key.null and (fk_present or pk_present) else: check = fk_present = pk_present = False join_list.append(( metadata, check, fk_present, pk_present)) stack.append(join.dest) return join_list def process_row(self, row): collected = self.construct_instances(row) instances = self.follow_joins(collected) for i in instances: i._prepare_instance() return instances[0] def construct_instances(self, row, keys=None): collected_models = {} for i, (key, constructor, attr, conv) in enumerate(self.column_map): if keys is not None and key not in keys: continue value = row[i] if key not in collected_models: collected_models[key] = constructor() instance = collected_models[key] if attr is None: attr = self.cursor.description[i][0] if conv is not None: value = conv(value) setattr(instance, attr, value) return collected_models def follow_joins(self, collected): prepared = [collected[self.model]] for (metadata, check_null, fk_present, pk_present) in self.join_list: inst = collected[metadata.src] try: joined_inst = collected[metadata.dest] except KeyError: joined_inst = 
collected[metadata.dest_model] has_fk = True if check_null: if fk_present: has_fk = inst._data.get(metadata.foreign_key.name) elif pk_present: has_fk = joined_inst._data.get(metadata.primary_key.name) if not has_fk: continue # Can we populate a value on the joined instance using the current? mpk = metadata.primary_key is not None can_populate_joined_pk = ( mpk and (metadata.attr in inst._data) and (getattr(joined_inst, metadata.primary_key.name) is None)) if can_populate_joined_pk: setattr( joined_inst, metadata.primary_key.name, inst._data[metadata.attr]) if metadata.is_backref: can_populate_joined_fk = ( mpk and (metadata.foreign_key is not None) and (getattr(inst, metadata.primary_key.name) is not None) and (joined_inst._data.get(metadata.foreign_key.name) is None)) if can_populate_joined_fk: setattr( joined_inst, metadata.foreign_key.name, inst) setattr(inst, metadata.attr, joined_inst) prepared.append(joined_inst) return prepared JoinCache = namedtuple('JoinCache', ('metadata', 'attr')) class AggregateQueryResultWrapper(ModelQueryResultWrapper): def __init__(self, *args, **kwargs): self._row = [] super(AggregateQueryResultWrapper, self).__init__(*args, **kwargs) def initialize(self, description): super(AggregateQueryResultWrapper, self).initialize(description) # Collect the set of all models (and ModelAlias objects) queried. self.all_models = set() for key, _, _, _ in self.column_map: self.all_models.add(key) # Prepare data structures for analyzing unique rows. Also cache # foreign key and attribute names for joined models. self.models_with_aggregate = set() self.back_references = {} self.source_to_dest = {} self.dest_to_source = {} for (metadata, _, _, _) in self.join_list: if metadata.is_backref: att_name = metadata.foreign_key.related_name else: att_name = metadata.attr is_backref = metadata.is_backref or metadata.is_self_join if is_backref: self.models_with_aggregate.add(metadata.src) else: self.dest_to_source.setdefault(metadata.dest, set()) self.dest_to_source[metadata.dest].add(metadata.src) self.source_to_dest.setdefault(metadata.src, {}) self.source_to_dest[metadata.src][metadata.dest] = JoinCache( metadata=metadata, attr=metadata.alias or att_name) # Determine which columns could contain "duplicate" data, e.g. if # getting Users and their Tweets, this would be the User columns. self.columns_to_compare = {} key_to_columns = {} for idx, (key, model_class, col_name, _) in enumerate(self.column_map): if key in self.models_with_aggregate: self.columns_to_compare.setdefault(key, []) self.columns_to_compare[key].append((idx, col_name)) key_to_columns.setdefault(key, []) key_to_columns[key].append((idx, col_name)) # Also compare columns for joins -> many-related model. 
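        # For each model that aggregates child rows, extend the duplicate-row
        # comparison with the columns of the model(s) it was joined from, so a
        # row is only treated as a repeat of the previous one when those
        # joined-from columns repeat as well.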
for model_or_alias in self.models_with_aggregate: if model_or_alias not in self.columns_to_compare: continue sources = self.dest_to_source.get(model_or_alias, ()) for joined_model in sources: self.columns_to_compare[model_or_alias].extend( key_to_columns[joined_model]) def read_model_data(self, row): models = {} for model_class, column_data in self.columns_to_compare.items(): models[model_class] = [] for idx, col_name in column_data: models[model_class].append(row[idx]) return models def iterate(self): if self._row: row = self._row.pop() else: row = self.cursor.fetchone() if not row: self._populated = True if not getattr(self.cursor, 'name', None): self.cursor.close() raise StopIteration elif not self._initialized: self.initialize(self.cursor.description) self._initialized = True def _get_pk(instance): if instance._meta.composite_key: return tuple([ instance._data[field_name] for field_name in instance._meta.primary_key.field_names]) return instance._get_pk_value() identity_map = {} _constructed = self.construct_instances(row) primary_instance = _constructed[self.model] for model_or_alias, instance in _constructed.items(): identity_map[model_or_alias] = OrderedDict() identity_map[model_or_alias][_get_pk(instance)] = instance model_data = self.read_model_data(row) while True: cur_row = self.cursor.fetchone() if cur_row is None: break duplicate_models = set() cur_row_data = self.read_model_data(cur_row) for model_class, data in cur_row_data.items(): if model_data[model_class] == data: duplicate_models.add(model_class) if not duplicate_models: self._row.append(cur_row) break different_models = self.all_models - duplicate_models new_instances = self.construct_instances(cur_row, different_models) for model_or_alias, instance in new_instances.items(): # Do not include any instances which are comprised solely of # NULL values. all_none = True for value in instance._data.values(): if value is not None: all_none = False if not all_none: identity_map[model_or_alias][_get_pk(instance)] = instance stack = [self.model] instances = [primary_instance] while stack: current = stack.pop() if current not in self.join_meta: continue for join in self.join_meta[current]: try: metadata, attr = self.source_to_dest[current][join.dest] except KeyError: continue if metadata.is_backref or metadata.is_self_join: for instance in identity_map[current].values(): setattr(instance, attr, []) if join.dest not in identity_map: continue for pk, inst in identity_map[join.dest].items(): if pk is None: continue try: # XXX: if no FK exists, unable to join. joined_inst = identity_map[current][ inst._data[metadata.foreign_key.name]] except KeyError: continue getattr(joined_inst, attr).append(inst) instances.append(inst) elif attr: if join.dest not in identity_map: continue for pk, instance in identity_map[current].items(): # XXX: if no FK exists, unable to join. joined_inst = identity_map[join.dest][ instance._data[metadata.foreign_key.name]] setattr( instance, metadata.foreign_key.name, joined_inst) instances.append(joined_inst) stack.append(join.dest) for instance in instances: instance._prepare_instance() return primary_instance class Query(Node): """Base class representing a database query on one or more tables.""" require_commit = True def __init__(self, model_class): super(Query, self).__init__() self.model_class = model_class self.database = model_class._meta.database self._dirty = True self._query_ctx = model_class self._joins = {self.model_class: []} # Join graph as adjacency list. 
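        # _query_ctx tracks the model currently being joined from; join()
        # appends a Join edge under the current context and switch() resets
        # it, e.g. (model names are purely illustrative):
        #   User.select().join(Tweet).switch(User).join(Profile)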
self._where = None def __repr__(self): sql, params = self.sql() return '%s %s %s' % (self.model_class, sql, params) def clone(self): query = type(self)(self.model_class) query.database = self.database return self._clone_attributes(query) def _clone_attributes(self, query): if self._where is not None: query._where = self._where.clone() query._joins = self._clone_joins() query._query_ctx = self._query_ctx return query def _clone_joins(self): return dict( (mc, list(j)) for mc, j in self._joins.items()) def _add_query_clauses(self, initial, expressions, conjunction=None): reduced = reduce(operator.and_, expressions) if initial is None: return reduced conjunction = conjunction or operator.and_ return conjunction(initial, reduced) def _model_shorthand(self, args): accum = [] for arg in args: if isinstance(arg, Node): accum.append(arg) elif isinstance(arg, Query): accum.append(arg) elif isinstance(arg, ModelAlias): accum.extend(arg.get_proxy_fields()) elif isclass(arg) and issubclass(arg, Model): accum.extend(arg._meta.declared_fields) return accum @returns_clone def where(self, *expressions): self._where = self._add_query_clauses(self._where, expressions) @returns_clone def orwhere(self, *expressions): self._where = self._add_query_clauses( self._where, expressions, operator.or_) @returns_clone def join(self, dest, join_type=None, on=None): src = self._query_ctx if not on: require_join_condition = ( isinstance(dest, SelectQuery) or (isclass(dest) and not src._meta.rel_exists(dest))) if require_join_condition: raise ValueError('A join condition must be specified.') elif isinstance(on, basestring): on = src._meta.fields[on] self._joins.setdefault(src, []) self._joins[src].append(Join(src, dest, join_type, on)) if not isinstance(dest, SelectQuery): self._query_ctx = dest @returns_clone def switch(self, model_class=None): """Change or reset the query context.""" self._query_ctx = model_class or self.model_class def ensure_join(self, lm, rm, on=None, **join_kwargs): ctx = self._query_ctx for join in self._joins.get(lm, []): if join.dest == rm: return self return self.switch(lm).join(rm, on=on, **join_kwargs).switch(ctx) def convert_dict_to_node(self, qdict): accum = [] joins = [] relationship = (ForeignKeyField, ReverseRelationDescriptor) for key, value in sorted(qdict.items()): curr = self.model_class if '__' in key and key.rsplit('__', 1)[1] in DJANGO_MAP: key, op = key.rsplit('__', 1) op = DJANGO_MAP[op] else: op = OP.EQ for piece in key.split('__'): model_attr = getattr(curr, piece) if isinstance(model_attr, relationship): curr = model_attr.rel_model joins.append(model_attr) accum.append(Expression(model_attr, op, value)) return accum, joins def filter(self, *args, **kwargs): # normalize args and kwargs into a new expression dq_node = Node() if args: dq_node &= reduce(operator.and_, [a.clone() for a in args]) if kwargs: dq_node &= DQ(**kwargs) # dq_node should now be an Expression, lhs = Node(), rhs = ... q = deque([dq_node]) dq_joins = set() while q: curr = q.popleft() if not isinstance(curr, Expression): continue for side, piece in (('lhs', curr.lhs), ('rhs', curr.rhs)): if isinstance(piece, DQ): query, joins = self.convert_dict_to_node(piece.query) dq_joins.update(joins) expression = reduce(operator.and_, query) # Apply values from the DQ object. 
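                    # Carry the DQ node's negation and alias over to the
                    # reduced expression before grafting it back into the
                    # expression tree in place of the DQ placeholder.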
expression._negated = piece._negated expression._alias = piece._alias setattr(curr, side, expression) else: q.append(piece) dq_node = dq_node.rhs query = self.clone() for field in dq_joins: if isinstance(field, ForeignKeyField): lm, rm = field.model_class, field.rel_model field_obj = field elif isinstance(field, ReverseRelationDescriptor): lm, rm = field.field.rel_model, field.rel_model field_obj = field.field query = query.ensure_join(lm, rm, field_obj) return query.where(dq_node) def compiler(self): return self.database.compiler() def sql(self): raise NotImplementedError def _execute(self): sql, params = self.sql() return self.database.execute_sql(sql, params, self.require_commit) def execute(self): raise NotImplementedError def scalar(self, as_tuple=False, convert=False): if convert: row = self.tuples().first() else: row = self._execute().fetchone() if row and not as_tuple: return row[0] else: return row class RawQuery(Query): """ Execute a SQL query, returning a standard iterable interface that returns model instances. """ def __init__(self, model, query, *params): self._sql = query self._params = list(params) self._qr = None self._tuples = False self._dicts = False super(RawQuery, self).__init__(model) def clone(self): query = RawQuery(self.model_class, self._sql, *self._params) query._tuples = self._tuples query._dicts = self._dicts return query join = not_allowed('joining') where = not_allowed('where') switch = not_allowed('switch') @returns_clone def tuples(self, tuples=True): self._tuples = tuples @returns_clone def dicts(self, dicts=True): self._dicts = dicts def sql(self): return self._sql, self._params def execute(self): if self._qr is None: if self._tuples: QRW = self.database.get_result_wrapper(RESULTS_TUPLES) elif self._dicts: QRW = self.database.get_result_wrapper(RESULTS_DICTS) else: QRW = self.database.get_result_wrapper(RESULTS_NAIVE) self._qr = QRW(self.model_class, self._execute(), None) return self._qr def __iter__(self): return iter(self.execute()) def allow_extend(orig, new_val, **kwargs): extend = kwargs.pop('extend', False) if kwargs: raise ValueError('"extend" is the only valid keyword argument.') if extend: return ((orig or []) + new_val) or None elif new_val: return new_val class SelectQuery(Query): _node_type = 'select_query' def __init__(self, model_class, *selection): super(SelectQuery, self).__init__(model_class) self.require_commit = self.database.commit_select self.__select(*selection) self._from = None self._group_by = None self._having = None self._order_by = None self._windows = None self._limit = None self._offset = None self._distinct = False self._for_update = (False, False) self._naive = False self._tuples = False self._dicts = False self._aggregate_rows = False self._alias = None self._qr = None def _clone_attributes(self, query): query = super(SelectQuery, self)._clone_attributes(query) query._explicit_selection = self._explicit_selection query._select = list(self._select) if self._from is not None: query._from = [] for f in self._from: if isinstance(f, Node): query._from.append(f.clone()) else: query._from.append(f) if self._group_by is not None: query._group_by = list(self._group_by) if self._having: query._having = self._having.clone() if self._order_by is not None: query._order_by = list(self._order_by) if self._windows is not None: query._windows = list(self._windows) query._limit = self._limit query._offset = self._offset query._distinct = self._distinct query._for_update = self._for_update query._naive = self._naive query._tuples = 
self._tuples query._dicts = self._dicts query._aggregate_rows = self._aggregate_rows query._alias = self._alias return query def compound_op(operator): def inner(self, other): supported_ops = self.model_class._meta.database.compound_operations if operator not in supported_ops: raise ValueError( 'Your database does not support %s' % operator) return CompoundSelect(self.model_class, self, operator, other) return inner _compound_op_static = staticmethod(compound_op) __or__ = compound_op('UNION') __and__ = compound_op('INTERSECT') __sub__ = compound_op('EXCEPT') def __xor__(self, rhs): # Symmetric difference, should just be (self | rhs) - (self & rhs)... wrapped_rhs = self.model_class.select(SQL('*')).from_( EnclosedClause((self & rhs)).alias('_')).order_by() return (self | rhs) - wrapped_rhs def union_all(self, rhs): return SelectQuery._compound_op_static('UNION ALL')(self, rhs) def __select(self, *selection): self._explicit_selection = len(selection) > 0 selection = selection or self.model_class._meta.declared_fields self._select = self._model_shorthand(selection) select = returns_clone(__select) @returns_clone def from_(self, *args): self._from = list(args) if args else None @returns_clone def group_by(self, *args, **kwargs): self._group_by = self._model_shorthand(args) if args else None @returns_clone def having(self, *expressions): self._having = self._add_query_clauses(self._having, expressions) @returns_clone def order_by(self, *args, **kwargs): self._order_by = allow_extend(self._order_by, list(args), **kwargs) @returns_clone def window(self, *windows, **kwargs): self._windows = allow_extend(self._windows, list(windows), **kwargs) @returns_clone def limit(self, lim): self._limit = lim @returns_clone def offset(self, off): self._offset = off @returns_clone def paginate(self, page, paginate_by=20): if page > 0: page -= 1 self._limit = paginate_by self._offset = page * paginate_by @returns_clone def distinct(self, is_distinct=True): self._distinct = is_distinct @returns_clone def for_update(self, for_update=True, nowait=False): self._for_update = (for_update, nowait) @returns_clone def naive(self, naive=True): self._naive = naive @returns_clone def tuples(self, tuples=True): self._tuples = tuples @returns_clone def dicts(self, dicts=True): self._dicts = dicts @returns_clone def aggregate_rows(self, aggregate_rows=True): self._aggregate_rows = aggregate_rows @returns_clone def alias(self, alias=None): self._alias = alias def annotate(self, rel_model, annotation=None): if annotation is None: annotation = fn.Count(rel_model._meta.primary_key).alias('count') if self._query_ctx == rel_model: query = self.switch(self.model_class) else: query = self.clone() query = query.ensure_join(query._query_ctx, rel_model) if not query._group_by: query._group_by = [x.alias() for x in query._select] query._select = tuple(query._select) + (annotation,) return query def _aggregate(self, aggregation=None): if aggregation is None: aggregation = fn.Count(SQL('*')) query = self.order_by() query._select = [aggregation] return query def aggregate(self, aggregation=None, convert=True): return self._aggregate(aggregation).scalar(convert=convert) def count(self, clear_limit=False): if self._distinct or self._group_by or self._limit or self._offset: return self.wrapped_count(clear_limit=clear_limit) # defaults to a count() of the primary key return self.aggregate(convert=False) or 0 def wrapped_count(self, clear_limit=False): clone = self.order_by() if clear_limit: clone._limit = clone._offset = None sql, params = 
clone.sql() wrapped = 'SELECT COUNT(1) FROM (%s) AS wrapped_select' % sql rq = self.model_class.raw(wrapped, *params) return rq.scalar() or 0 def exists(self): clone = self.paginate(1, 1) clone._select = [SQL('1')] return bool(clone.scalar()) def get(self): clone = self.paginate(1, 1) try: return next(clone.execute()) except StopIteration: raise self.model_class.DoesNotExist( 'Instance matching query does not exist:\nSQL: %s\nPARAMS: %s' % self.sql()) def peek(self, n=1): res = self.execute() res.fill_cache(n) models = res._result_cache[:n] if models: return models[0] if n == 1 else models def first(self, n=1): if self._limit != n: self._limit = n self._dirty = True return self.peek(n=n) def sql(self): return self.compiler().generate_select(self) def verify_naive(self): model_class = self.model_class for node in self._select: if isinstance(node, Field) and node.model_class != model_class: return False elif isinstance(node, Node) and node._bind_to is not None: if node._bind_to != model_class: return False return True def get_query_meta(self): return (self._select, self._joins) def _get_result_wrapper(self): if self._tuples: return self.database.get_result_wrapper(RESULTS_TUPLES) elif self._dicts: return self.database.get_result_wrapper(RESULTS_DICTS) elif self._naive or not self._joins or self.verify_naive(): return self.database.get_result_wrapper(RESULTS_NAIVE) elif self._aggregate_rows: return self.database.get_result_wrapper(RESULTS_AGGREGATE_MODELS) else: return self.database.get_result_wrapper(RESULTS_MODELS) def execute(self): if self._dirty or self._qr is None: model_class = self.model_class query_meta = self.get_query_meta() ResultWrapper = self._get_result_wrapper() self._qr = ResultWrapper(model_class, self._execute(), query_meta) self._dirty = False return self._qr else: return self._qr def __iter__(self): return iter(self.execute()) def iterator(self): return iter(self.execute().iterator()) def __getitem__(self, value): res = self.execute() if isinstance(value, slice): index = value.stop else: index = value if index is not None: index = index + 1 if index >= 0 else None res.fill_cache(index) return res._result_cache[value] def __len__(self): return len(self.execute()) if PY3: def __hash__(self): return id(self) class NoopSelectQuery(SelectQuery): def sql(self): return (self.database.get_noop_sql(), ()) def get_query_meta(self): return None, None def _get_result_wrapper(self): return self.database.get_result_wrapper(RESULTS_TUPLES) class CompoundSelect(SelectQuery): _node_type = 'compound_select_query' def __init__(self, model_class, lhs=None, operator=None, rhs=None): self.lhs = lhs self.operator = operator self.rhs = rhs super(CompoundSelect, self).__init__(model_class, []) def _clone_attributes(self, query): query = super(CompoundSelect, self)._clone_attributes(query) query.lhs = self.lhs query.operator = self.operator query.rhs = self.rhs return query def count(self, clear_limit=False): return self.wrapped_count(clear_limit=clear_limit) def get_query_meta(self): return self.lhs.get_query_meta() def verify_naive(self): return self.lhs.verify_naive() and self.rhs.verify_naive() def _get_result_wrapper(self): if self._tuples: return self.database.get_result_wrapper(RESULTS_TUPLES) elif self._dicts: return self.database.get_result_wrapper(RESULTS_DICTS) elif self._aggregate_rows: return self.database.get_result_wrapper(RESULTS_AGGREGATE_MODELS) has_joins = self.lhs._joins or self.rhs._joins is_naive = self.lhs._naive or self.rhs._naive or self._naive if is_naive or not has_joins 
or self.verify_naive(): return self.database.get_result_wrapper(RESULTS_NAIVE) else: return self.database.get_result_wrapper(RESULTS_MODELS) class _WriteQuery(Query): def __init__(self, model_class): self._returning = None self._tuples = False self._dicts = False self._qr = None super(_WriteQuery, self).__init__(model_class) def _clone_attributes(self, query): query = super(_WriteQuery, self)._clone_attributes(query) if self._returning: query._returning = list(self._returning) query._tuples = self._tuples query._dicts = self._dicts return query def requires_returning(method): def inner(self, *args, **kwargs): db = self.model_class._meta.database if not db.returning_clause: raise ValueError('RETURNING is not supported by your ' 'database: %s' % type(db)) return method(self, *args, **kwargs) return inner @requires_returning @returns_clone def returning(self, *selection): if len(selection) == 1 and selection[0] is None: self._returning = None else: if not selection: selection = self.model_class._meta.declared_fields self._returning = self._model_shorthand(selection) @requires_returning @returns_clone def tuples(self, tuples=True): self._tuples = tuples @requires_returning @returns_clone def dicts(self, dicts=True): self._dicts = dicts def get_result_wrapper(self): if self._returning is not None: if self._tuples: return self.database.get_result_wrapper(RESULTS_TUPLES) elif self._dicts: return self.database.get_result_wrapper(RESULTS_DICTS) return self.database.get_result_wrapper(RESULTS_NAIVE) def _execute_with_result_wrapper(self): ResultWrapper = self.get_result_wrapper() meta = (self._returning, {self.model_class: []}) self._qr = ResultWrapper(self.model_class, self._execute(), meta) return self._qr class UpdateQuery(_WriteQuery): def __init__(self, model_class, update=None): self._update = update self._on_conflict = None super(UpdateQuery, self).__init__(model_class) def _clone_attributes(self, query): query = super(UpdateQuery, self)._clone_attributes(query) query._update = dict(self._update) query._on_conflict = self._on_conflict return query @returns_clone def on_conflict(self, action=None): self._on_conflict = action join = not_allowed('joining') def sql(self): return self.compiler().generate_update(self) def execute(self): if self._returning is not None and self._qr is None: return self._execute_with_result_wrapper() elif self._qr is not None: return self._qr else: return self.database.rows_affected(self._execute()) def __iter__(self): if not self.model_class._meta.database.returning_clause: raise ValueError('UPDATE queries cannot be iterated over unless ' 'they specify a RETURNING clause, which is not ' 'supported by your database.') return iter(self.execute()) def iterator(self): return iter(self.execute().iterator()) class InsertQuery(_WriteQuery): def __init__(self, model_class, field_dict=None, rows=None, fields=None, query=None, validate_fields=False): super(InsertQuery, self).__init__(model_class) self._upsert = False self._is_multi_row_insert = rows is not None or query is not None self._return_id_list = False if rows is not None: self._rows = rows else: self._rows = [field_dict or {}] self._fields = fields self._query = query self._validate_fields = validate_fields self._on_conflict = None def _iter_rows(self): model_meta = self.model_class._meta if self._validate_fields: valid_fields = model_meta.valid_fields def validate_field(field): if field not in valid_fields: raise KeyError('"%s" is not a recognized field.' 
% field) defaults = model_meta._default_dict callables = model_meta._default_callables for row_dict in self._rows: field_row = defaults.copy() seen = set() for key in row_dict: if self._validate_fields: validate_field(key) if key in model_meta.fields: field = model_meta.fields[key] else: field = key field_row[field] = row_dict[key] seen.add(field) if callables: for field in callables: if field not in seen: field_row[field] = callables[field]() yield field_row def _clone_attributes(self, query): query = super(InsertQuery, self)._clone_attributes(query) query._rows = self._rows query._upsert = self._upsert query._is_multi_row_insert = self._is_multi_row_insert query._fields = self._fields query._query = self._query query._return_id_list = self._return_id_list query._validate_fields = self._validate_fields query._on_conflict = self._on_conflict return query join = not_allowed('joining') where = not_allowed('where clause') @returns_clone def upsert(self, upsert=True): self._upsert = upsert @returns_clone def on_conflict(self, action=None): self._on_conflict = action @returns_clone def return_id_list(self, return_id_list=True): self._return_id_list = return_id_list @property def is_insert_returning(self): if self.database.insert_returning: if not self._is_multi_row_insert or self._return_id_list: return True return False def sql(self): return self.compiler().generate_insert(self) def _insert_with_loop(self): id_list = [] last_id = None return_id_list = self._return_id_list for row in self._rows: last_id = (InsertQuery(self.model_class, row) .upsert(self._upsert) .execute()) if return_id_list: id_list.append(last_id) if return_id_list: return id_list else: return last_id def execute(self): insert_with_loop = ( self._is_multi_row_insert and self._query is None and self._returning is None and not self.database.insert_many) if insert_with_loop: return self._insert_with_loop() if self._returning is not None and self._qr is None: return self._execute_with_result_wrapper() elif self._qr is not None: return self._qr else: cursor = self._execute() if not self._is_multi_row_insert: if self.database.insert_returning: pk_row = cursor.fetchone() meta = self.model_class._meta clean_data = [ field.python_value(column) for field, column in zip(meta.get_primary_key_fields(), pk_row)] if self.model_class._meta.composite_key: return clean_data return clean_data[0] return self.database.last_insert_id(cursor, self.model_class) elif self._return_id_list: return map(operator.itemgetter(0), cursor.fetchall()) else: return True class DeleteQuery(_WriteQuery): join = not_allowed('joining') def sql(self): return self.compiler().generate_delete(self) def execute(self): if self._returning is not None and self._qr is None: return self._execute_with_result_wrapper() elif self._qr is not None: return self._qr else: return self.database.rows_affected(self._execute()) IndexMetadata = namedtuple( 'IndexMetadata', ('name', 'sql', 'columns', 'unique', 'table')) ColumnMetadata = namedtuple( 'ColumnMetadata', ('name', 'data_type', 'null', 'primary_key', 'table')) ForeignKeyMetadata = namedtuple( 'ForeignKeyMetadata', ('column', 'dest_table', 'dest_column', 'table')) class PeeweeException(Exception): pass class ImproperlyConfigured(PeeweeException): pass class DatabaseError(PeeweeException): pass class DataError(DatabaseError): pass class IntegrityError(DatabaseError): pass class InterfaceError(PeeweeException): pass class InternalError(DatabaseError): pass class NotSupportedError(DatabaseError): pass class 
OperationalError(DatabaseError): pass class ProgrammingError(DatabaseError): pass class ExceptionWrapper(object): __slots__ = ['exceptions'] def __init__(self, exceptions): self.exceptions = exceptions def __enter__(self): pass def __exit__(self, exc_type, exc_value, traceback): if exc_type is None: return if exc_type.__name__ in self.exceptions: new_type = self.exceptions[exc_type.__name__] if PY26: exc_args = exc_value else: exc_args = exc_value.args reraise(new_type, new_type(*exc_args), traceback) class _BaseConnectionLocal(object): def __init__(self, **kwargs): super(_BaseConnectionLocal, self).__init__(**kwargs) self.autocommit = None self.closed = True self.conn = None self.context_stack = [] self.transactions = [] class _ConnectionLocal(_BaseConnectionLocal, threading.local): pass class Database(object): commit_select = False compiler_class = QueryCompiler compound_operations = ['UNION', 'INTERSECT', 'EXCEPT', 'UNION ALL'] compound_select_parentheses = False distinct_on = False drop_cascade = False field_overrides = {} foreign_keys = True for_update = False for_update_nowait = False insert_many = True insert_returning = False interpolation = '?' limit_max = None op_overrides = {} quote_char = '"' reserved_tables = [] returning_clause = False savepoints = True sequences = False subquery_delete_same_table = True upsert_sql = None window_functions = False exceptions = { 'ConstraintError': IntegrityError, 'DatabaseError': DatabaseError, 'DataError': DataError, 'IntegrityError': IntegrityError, 'InterfaceError': InterfaceError, 'InternalError': InternalError, 'NotSupportedError': NotSupportedError, 'OperationalError': OperationalError, 'ProgrammingError': ProgrammingError} def __init__(self, database, threadlocals=True, autocommit=True, fields=None, ops=None, autorollback=False, use_speedups=True, **connect_kwargs): self.connect_kwargs = {} if threadlocals: self._local = _ConnectionLocal() else: self._local = _BaseConnectionLocal() self.init(database, **connect_kwargs) self._conn_lock = threading.Lock() self.autocommit = autocommit self.autorollback = autorollback self.use_speedups = use_speedups self.field_overrides = merge_dict(self.field_overrides, fields or {}) self.op_overrides = merge_dict(self.op_overrides, ops or {}) def init(self, database, **connect_kwargs): if not self.is_closed(): self.close() self.deferred = database is None self.database = database self.connect_kwargs.update(connect_kwargs) def exception_wrapper(self): return ExceptionWrapper(self.exceptions) def connect(self): with self._conn_lock: if self.deferred: raise Exception('Error, database not properly initialized ' 'before opening connection') with self.exception_wrapper(): self._local.conn = self._connect( self.database, **self.connect_kwargs) self._local.closed = False self.initialize_connection(self._local.conn) def initialize_connection(self, conn): pass def close(self): with self._conn_lock: if self.deferred: raise Exception('Error, database not properly initialized ' 'before closing connection') with self.exception_wrapper(): self._close(self._local.conn) self._local.closed = True def get_conn(self): if self._local.context_stack: conn = self._local.context_stack[-1].connection if conn is not None: return conn if self._local.closed: self.connect() return self._local.conn def is_closed(self): return self._local.closed def get_cursor(self): return self.get_conn().cursor() def _close(self, conn): conn.close() def _connect(self, database, **kwargs): raise NotImplementedError @classmethod def 
register_fields(cls, fields): cls.field_overrides = merge_dict(cls.field_overrides, fields) @classmethod def register_ops(cls, ops): cls.op_overrides = merge_dict(cls.op_overrides, ops) def get_result_wrapper(self, wrapper_type): if wrapper_type == RESULTS_NAIVE: return (_ModelQueryResultWrapper if self.use_speedups else NaiveQueryResultWrapper) elif wrapper_type == RESULTS_MODELS: return ModelQueryResultWrapper elif wrapper_type == RESULTS_TUPLES: return (_TuplesQueryResultWrapper if self.use_speedups else TuplesQueryResultWrapper) elif wrapper_type == RESULTS_DICTS: return (_DictQueryResultWrapper if self.use_speedups else DictQueryResultWrapper) elif wrapper_type == RESULTS_AGGREGATE_MODELS: return AggregateQueryResultWrapper else: return (_ModelQueryResultWrapper if self.use_speedups else NaiveQueryResultWrapper) def last_insert_id(self, cursor, model): if model._meta.auto_increment: return cursor.lastrowid def rows_affected(self, cursor): return cursor.rowcount def compiler(self): return self.compiler_class( self.quote_char, self.interpolation, self.field_overrides, self.op_overrides) def execute(self, clause): return self.execute_sql(*self.compiler().parse_node(clause)) def execute_sql(self, sql, params=None, require_commit=True): logger.debug((sql, params)) with self.exception_wrapper(): cursor = self.get_cursor() try: cursor.execute(sql, params or ()) except Exception: if self.get_autocommit() and self.autorollback: self.rollback() raise else: if require_commit and self.get_autocommit(): self.commit() return cursor def begin(self): pass def commit(self): self.get_conn().commit() def rollback(self): self.get_conn().rollback() def set_autocommit(self, autocommit): self._local.autocommit = autocommit def get_autocommit(self): if self._local.autocommit is None: self.set_autocommit(self.autocommit) return self._local.autocommit def push_execution_context(self, transaction): self._local.context_stack.append(transaction) def pop_execution_context(self): self._local.context_stack.pop() def execution_context_depth(self): return len(self._local.context_stack) def execution_context(self, with_transaction=True): return ExecutionContext(self, with_transaction=with_transaction) def push_transaction(self, transaction): self._local.transactions.append(transaction) def pop_transaction(self): self._local.transactions.pop() def transaction_depth(self): return len(self._local.transactions) def transaction(self): return transaction(self) def commit_on_success(self, func): @wraps(func) def inner(*args, **kwargs): with self.transaction(): return func(*args, **kwargs) return inner def savepoint(self, sid=None): if not self.savepoints: raise NotImplementedError return savepoint(self, sid) def atomic(self): return _atomic(self) def get_tables(self, schema=None): raise NotImplementedError def get_indexes(self, table, schema=None): raise NotImplementedError def get_columns(self, table, schema=None): raise NotImplementedError def get_primary_keys(self, table, schema=None): raise NotImplementedError def get_foreign_keys(self, table, schema=None): raise NotImplementedError def sequence_exists(self, seq): raise NotImplementedError def create_table(self, model_class, safe=False): qc = self.compiler() return self.execute_sql(*qc.create_table(model_class, safe)) def create_tables(self, models, safe=False): create_model_tables(models, fail_silently=safe) def create_index(self, model_class, fields, unique=False): qc = self.compiler() if not isinstance(fields, (list, tuple)): raise ValueError('Fields passed to 
"create_index" must be a list ' 'or tuple: "%s"' % fields) fobjs = [ model_class._meta.fields[f] if isinstance(f, basestring) else f for f in fields] return self.execute_sql(*qc.create_index(model_class, fobjs, unique)) def drop_index(self, model_class, fields, safe=False): qc = self.compiler() if not isinstance(fields, (list, tuple)): raise ValueError('Fields passed to "drop_index" must be a list ' 'or tuple: "%s"' % fields) fobjs = [ model_class._meta.fields[f] if isinstance(f, basestring) else f for f in fields] return self.execute_sql(*qc.drop_index(model_class, fobjs, safe)) def create_foreign_key(self, model_class, field, constraint=None): qc = self.compiler() return self.execute_sql(*qc.create_foreign_key( model_class, field, constraint)) def create_sequence(self, seq): if self.sequences: qc = self.compiler() return self.execute_sql(*qc.create_sequence(seq)) def drop_table(self, model_class, fail_silently=False, cascade=False): qc = self.compiler() return self.execute_sql(*qc.drop_table( model_class, fail_silently, cascade)) def drop_tables(self, models, safe=False, cascade=False): drop_model_tables(models, fail_silently=safe, cascade=cascade) def truncate_table(self, model_class, restart_identity=False, cascade=False): qc = self.compiler() return self.execute_sql(*qc.truncate_table( model_class, restart_identity, cascade)) def truncate_tables(self, models, restart_identity=False, cascade=False): for model in reversed(sort_models_topologically(models)): model.truncate_table(restart_identity, cascade) def drop_sequence(self, seq): if self.sequences: qc = self.compiler() return self.execute_sql(*qc.drop_sequence(seq)) def extract_date(self, date_part, date_field): return fn.EXTRACT(Clause(date_part, R('FROM'), date_field)) def truncate_date(self, date_part, date_field): return fn.DATE_TRUNC(date_part, date_field) def default_insert_clause(self, model_class): return SQL('DEFAULT VALUES') def get_noop_sql(self): return 'SELECT 0 WHERE 0' def get_binary_type(self): return binary_construct class SqliteDatabase(Database): compiler_class = SqliteQueryCompiler field_overrides = { 'bool': 'INTEGER', 'smallint': 'INTEGER', 'uuid': 'TEXT', } foreign_keys = False insert_many = sqlite3 and sqlite3.sqlite_version_info >= (3, 7, 11, 0) limit_max = -1 op_overrides = { OP.LIKE: 'GLOB', OP.ILIKE: 'LIKE', } upsert_sql = 'INSERT OR REPLACE INTO' def __init__(self, database, pragmas=None, *args, **kwargs): self._pragmas = pragmas or [] journal_mode = kwargs.pop('journal_mode', None) # Backwards-compat. 
if journal_mode: self._pragmas.append(('journal_mode', journal_mode)) super(SqliteDatabase, self).__init__(database, *args, **kwargs) def _connect(self, database, **kwargs): if not sqlite3: raise ImproperlyConfigured('pysqlite or sqlite3 must be installed.') conn = sqlite3.connect(database, **kwargs) conn.isolation_level = None try: self._add_conn_hooks(conn) except: conn.close() raise return conn def _add_conn_hooks(self, conn): self._set_pragmas(conn) conn.create_function('date_part', 2, _sqlite_date_part) conn.create_function('date_trunc', 2, _sqlite_date_trunc) conn.create_function('regexp', 2, _sqlite_regexp) def _set_pragmas(self, conn): if self._pragmas: cursor = conn.cursor() for pragma, value in self._pragmas: cursor.execute('PRAGMA %s = %s;' % (pragma, value)) cursor.close() def begin(self, lock_type='DEFERRED'): self.execute_sql('BEGIN %s' % lock_type, require_commit=False) def create_foreign_key(self, model_class, field, constraint=None): raise OperationalError('SQLite does not support ALTER TABLE ' 'statements to add constraints.') def get_tables(self, schema=None): cursor = self.execute_sql('SELECT name FROM sqlite_master WHERE ' 'type = ? ORDER BY name;', ('table',)) return [row[0] for row in cursor.fetchall()] def get_indexes(self, table, schema=None): query = ('SELECT name, sql FROM sqlite_master ' 'WHERE tbl_name = ? AND type = ? ORDER BY name') cursor = self.execute_sql(query, (table, 'index')) index_to_sql = dict(cursor.fetchall()) # Determine which indexes have a unique constraint. unique_indexes = set() cursor = self.execute_sql('PRAGMA index_list("%s")' % table) for row in cursor.fetchall(): name = row[1] is_unique = int(row[2]) == 1 if is_unique: unique_indexes.add(name) # Retrieve the indexed columns. index_columns = {} for index_name in sorted(index_to_sql): cursor = self.execute_sql('PRAGMA index_info("%s")' % index_name) index_columns[index_name] = [row[2] for row in cursor.fetchall()] return [ IndexMetadata( name, index_to_sql[name], index_columns[name], name in unique_indexes, table) for name in sorted(index_to_sql)] def get_columns(self, table, schema=None): cursor = self.execute_sql('PRAGMA table_info("%s")' % table) return [ColumnMetadata(row[1], row[2], not row[3], bool(row[5]), table) for row in cursor.fetchall()] def get_primary_keys(self, table, schema=None): cursor = self.execute_sql('PRAGMA table_info("%s")' % table) return [row[1] for row in cursor.fetchall() if row[-1]] def get_foreign_keys(self, table, schema=None): cursor = self.execute_sql('PRAGMA foreign_key_list("%s")' % table) return [ForeignKeyMetadata(row[3], row[2], row[4], table) for row in cursor.fetchall()] def savepoint(self, sid=None): return savepoint_sqlite(self, sid) def extract_date(self, date_part, date_field): return fn.date_part(date_part, date_field) def truncate_date(self, date_part, date_field): return fn.strftime(SQLITE_DATE_TRUNC_MAPPING[date_part], date_field) def get_binary_type(self): return sqlite3.Binary class PostgresqlDatabase(Database): commit_select = True compound_select_parentheses = True distinct_on = True drop_cascade = True field_overrides = { 'blob': 'BYTEA', 'bool': 'BOOLEAN', 'datetime': 'TIMESTAMP', 'decimal': 'NUMERIC', 'double': 'DOUBLE PRECISION', 'primary_key': 'SERIAL', 'uuid': 'UUID', } for_update = True for_update_nowait = True insert_returning = True interpolation = '%s' op_overrides = { OP.REGEXP: '~', } reserved_tables = ['user'] returning_clause = True sequences = True window_functions = True register_unicode = True def _connect(self, database, 
encoding=None, **kwargs): if not psycopg2: raise ImproperlyConfigured('psycopg2 must be installed.') conn = psycopg2.connect(database=database, **kwargs) if self.register_unicode: pg_extensions.register_type(pg_extensions.UNICODE, conn) pg_extensions.register_type(pg_extensions.UNICODEARRAY, conn) if encoding: conn.set_client_encoding(encoding) return conn def _get_pk_sequence(self, model): meta = model._meta if meta.primary_key is not False and meta.primary_key.sequence: return meta.primary_key.sequence elif meta.auto_increment: return '%s_%s_seq' % (meta.db_table, meta.primary_key.db_column) def last_insert_id(self, cursor, model): sequence = self._get_pk_sequence(model) if not sequence: return meta = model._meta if meta.schema: schema = '%s.' % meta.schema else: schema = '' cursor.execute("SELECT CURRVAL('%s\"%s\"')" % (schema, sequence)) result = cursor.fetchone()[0] if self.get_autocommit(): self.commit() return result def get_tables(self, schema='public'): query = ('SELECT tablename FROM pg_catalog.pg_tables ' 'WHERE schemaname = %s ORDER BY tablename') return [r for r, in self.execute_sql(query, (schema,)).fetchall()] def get_indexes(self, table, schema='public'): query = """ SELECT i.relname, idxs.indexdef, idx.indisunique, array_to_string(array_agg(cols.attname), ',') FROM pg_catalog.pg_class AS t INNER JOIN pg_catalog.pg_index AS idx ON t.oid = idx.indrelid INNER JOIN pg_catalog.pg_class AS i ON idx.indexrelid = i.oid INNER JOIN pg_catalog.pg_indexes AS idxs ON (idxs.tablename = t.relname AND idxs.indexname = i.relname) LEFT OUTER JOIN pg_catalog.pg_attribute AS cols ON (cols.attrelid = t.oid AND cols.attnum = ANY(idx.indkey)) WHERE t.relname = %s AND t.relkind = %s AND idxs.schemaname = %s GROUP BY i.relname, idxs.indexdef, idx.indisunique ORDER BY idx.indisunique DESC, i.relname;""" cursor = self.execute_sql(query, (table, 'r', schema)) return [IndexMetadata(row[0], row[1], row[3].split(','), row[2], table) for row in cursor.fetchall()] def get_columns(self, table, schema='public'): query = """ SELECT column_name, is_nullable, data_type FROM information_schema.columns WHERE table_name = %s AND table_schema = %s ORDER BY ordinal_position""" cursor = self.execute_sql(query, (table, schema)) pks = set(self.get_primary_keys(table, schema)) return [ColumnMetadata(name, dt, null == 'YES', name in pks, table) for name, null, dt in cursor.fetchall()] def get_primary_keys(self, table, schema='public'): query = """ SELECT kc.column_name FROM information_schema.table_constraints AS tc INNER JOIN information_schema.key_column_usage AS kc ON ( tc.table_name = kc.table_name AND tc.table_schema = kc.table_schema AND tc.constraint_name = kc.constraint_name) WHERE tc.constraint_type = %s AND tc.table_name = %s AND tc.table_schema = %s""" cursor = self.execute_sql(query, ('PRIMARY KEY', table, schema)) return [row for row, in cursor.fetchall()] def get_foreign_keys(self, table, schema='public'): sql = """ SELECT kcu.column_name, ccu.table_name, ccu.column_name FROM information_schema.table_constraints AS tc JOIN information_schema.key_column_usage AS kcu ON (tc.constraint_name = kcu.constraint_name AND tc.constraint_schema = kcu.constraint_schema) JOIN information_schema.constraint_column_usage AS ccu ON (ccu.constraint_name = tc.constraint_name AND ccu.constraint_schema = tc.constraint_schema) WHERE tc.constraint_type = 'FOREIGN KEY' AND tc.table_name = %s AND tc.table_schema = %s""" cursor = self.execute_sql(sql, (table, schema)) return [ForeignKeyMetadata(row[0], row[1], row[2], table) for 
row in cursor.fetchall()] def sequence_exists(self, sequence): res = self.execute_sql(""" SELECT COUNT(*) FROM pg_class, pg_namespace WHERE relkind='S' AND pg_class.relnamespace = pg_namespace.oid AND relname=%s""", (sequence,)) return bool(res.fetchone()[0]) def set_search_path(self, *search_path): path_params = ','.join(['%s'] * len(search_path)) self.execute_sql('SET search_path TO %s' % path_params, search_path) def get_noop_sql(self): return 'SELECT 0 WHERE false' def get_binary_type(self): return psycopg2.Binary class MySQLDatabase(Database): commit_select = True compound_operations = ['UNION', 'UNION ALL'] field_overrides = { 'bool': 'BOOL', 'decimal': 'NUMERIC', 'double': 'DOUBLE PRECISION', 'float': 'FLOAT', 'primary_key': 'INTEGER AUTO_INCREMENT', 'text': 'LONGTEXT', 'uuid': 'VARCHAR(40)', } for_update = True interpolation = '%s' limit_max = 2 ** 64 - 1 # MySQL quirk op_overrides = { OP.LIKE: 'LIKE BINARY', OP.ILIKE: 'LIKE', OP.XOR: 'XOR', } quote_char = '`' subquery_delete_same_table = False upsert_sql = 'REPLACE INTO' def _connect(self, database, **kwargs): if not mysql: raise ImproperlyConfigured('MySQLdb or PyMySQL must be installed.') conn_kwargs = { 'charset': 'utf8', 'use_unicode': True, } conn_kwargs.update(kwargs) if 'password' in conn_kwargs: conn_kwargs['passwd'] = conn_kwargs.pop('password') return mysql.connect(db=database, **conn_kwargs) def get_tables(self, schema=None): return [row for row, in self.execute_sql('SHOW TABLES')] def get_indexes(self, table, schema=None): cursor = self.execute_sql('SHOW INDEX FROM `%s`' % table) unique = set() indexes = {} for row in cursor.fetchall(): if not row[1]: unique.add(row[2]) indexes.setdefault(row[2], []) indexes[row[2]].append(row[4]) return [IndexMetadata(name, None, indexes[name], name in unique, table) for name in indexes] def get_columns(self, table, schema=None): sql = """ SELECT column_name, is_nullable, data_type FROM information_schema.columns WHERE table_name = %s AND table_schema = DATABASE()""" cursor = self.execute_sql(sql, (table,)) pks = set(self.get_primary_keys(table)) return [ColumnMetadata(name, dt, null == 'YES', name in pks, table) for name, null, dt in cursor.fetchall()] def get_primary_keys(self, table, schema=None): cursor = self.execute_sql('SHOW INDEX FROM `%s`' % table) return [row[4] for row in cursor.fetchall() if row[2] == 'PRIMARY'] def get_foreign_keys(self, table, schema=None): query = """ SELECT column_name, referenced_table_name, referenced_column_name FROM information_schema.key_column_usage WHERE table_name = %s AND table_schema = DATABASE() AND referenced_table_name IS NOT NULL AND referenced_column_name IS NOT NULL""" cursor = self.execute_sql(query, (table,)) return [ ForeignKeyMetadata(column, dest_table, dest_column, table) for column, dest_table, dest_column in cursor.fetchall()] def extract_date(self, date_part, date_field): return fn.EXTRACT(Clause(R(date_part), R('FROM'), date_field)) def truncate_date(self, date_part, date_field): return fn.DATE_FORMAT(date_field, MYSQL_DATE_TRUNC_MAPPING[date_part]) def default_insert_clause(self, model_class): return Clause( EnclosedClause(model_class._meta.primary_key), SQL('VALUES (DEFAULT)')) def get_noop_sql(self): return 'DO 0' def get_binary_type(self): return mysql.Binary class _callable_context_manager(object): def __call__(self, fn): @wraps(fn) def inner(*args, **kwargs): with self: return fn(*args, **kwargs) return inner class ExecutionContext(_callable_context_manager): def __init__(self, database, with_transaction=True): 
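        # An ExecutionContext opens its own connection, pushes itself onto the
        # database's per-thread context stack (consulted by get_conn()), and
        # optionally wraps the block in a transaction. Illustrative usage:
        #   with db.execution_context():
        #       ...  # queries here use the context's dedicated connection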
        self.database = database
        self.with_transaction = with_transaction
        self.connection = None

    def __enter__(self):
        with self.database._conn_lock:
            self.database.push_execution_context(self)
            self.connection = self.database._connect(
                self.database.database,
                **self.database.connect_kwargs)
            if self.with_transaction:
                self.txn = self.database.transaction()
                self.txn.__enter__()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        with self.database._conn_lock:
            if self.connection is None:
                self.database.pop_execution_context()
            else:
                try:
                    if self.with_transaction:
                        if not exc_type:
                            self.txn.commit(False)
                        self.txn.__exit__(exc_type, exc_val, exc_tb)
                finally:
                    self.database.pop_execution_context()
                    self.database._close(self.connection)


class Using(ExecutionContext):
    def __init__(self, database, models, with_transaction=True):
        super(Using, self).__init__(database, with_transaction)
        self.models = models

    def __enter__(self):
        self._orig = []
        for model in self.models:
            self._orig.append(model._meta.database)
            model._meta.database = self.database
        return super(Using, self).__enter__()

    def __exit__(self, exc_type, exc_val, exc_tb):
        super(Using, self).__exit__(exc_type, exc_val, exc_tb)
        for i, model in enumerate(self.models):
            model._meta.database = self._orig[i]


class _atomic(_callable_context_manager):
    def __init__(self, db):
        self.db = db

    def __enter__(self):
        if self.db.transaction_depth() == 0:
            self._helper = self.db.transaction()
        else:
            self._helper = self.db.savepoint()
        return self._helper.__enter__()

    def __exit__(self, exc_type, exc_val, exc_tb):
        return self._helper.__exit__(exc_type, exc_val, exc_tb)


class transaction(_callable_context_manager):
    def __init__(self, db):
        self.db = db

    def _begin(self):
        self.db.begin()

    def commit(self, begin=True):
        self.db.commit()
        if begin:
            self._begin()

    def rollback(self, begin=True):
        self.db.rollback()
        if begin:
            self._begin()

    def __enter__(self):
        self._orig = self.db.get_autocommit()
        self.db.set_autocommit(False)
        if self.db.transaction_depth() == 0:
            self._begin()
        self.db.push_transaction(self)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        try:
            if exc_type:
                self.rollback(False)
            elif self.db.transaction_depth() == 1:
                try:
                    self.commit(False)
                except:
                    self.rollback(False)
                    raise
        finally:
            self.db.set_autocommit(self._orig)
            self.db.pop_transaction()


class savepoint(_callable_context_manager):
    def __init__(self, db, sid=None):
        self.db = db
        _compiler = db.compiler()
        self.sid = sid or 's' + uuid.uuid4().hex
        self.quoted_sid = _compiler.quote(self.sid)

    def _execute(self, query):
        self.db.execute_sql(query, require_commit=False)

    def commit(self):
        self._execute('RELEASE SAVEPOINT %s;' % self.quoted_sid)

    def rollback(self):
        self._execute('ROLLBACK TO SAVEPOINT %s;' % self.quoted_sid)

    def __enter__(self):
        self._orig_autocommit = self.db.get_autocommit()
        self.db.set_autocommit(False)
        self._execute('SAVEPOINT %s;' % self.quoted_sid)
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        try:
            if exc_type:
                self.rollback()
            else:
                try:
                    self.commit()
                except:
                    self.rollback()
                    raise
        finally:
            self.db.set_autocommit(self._orig_autocommit)


class savepoint_sqlite(savepoint):
    def __enter__(self):
        conn = self.db.get_conn()
        # For sqlite, the connection's isolation_level *must* be set to None.
        # The act of setting it, though, will break any existing savepoints,
        # so only write to it if necessary.
if conn.isolation_level is not None: self._orig_isolation_level = conn.isolation_level conn.isolation_level = None else: self._orig_isolation_level = None return super(savepoint_sqlite, self).__enter__() def __exit__(self, exc_type, exc_val, exc_tb): try: return super(savepoint_sqlite, self).__exit__( exc_type, exc_val, exc_tb) finally: if self._orig_isolation_level is not None: self.db.get_conn().isolation_level = self._orig_isolation_level class FieldProxy(Field): def __init__(self, alias, field_instance): self._model_alias = alias self.model = self._model_alias.model_class self.field_instance = field_instance def clone_base(self): return FieldProxy(self._model_alias, self.field_instance) def coerce(self, value): return self.field_instance.coerce(value) def python_value(self, value): return self.field_instance.python_value(value) def db_value(self, value): return self.field_instance.db_value(value) def __getattr__(self, attr): if attr == 'model_class': return self._model_alias return getattr(self.field_instance, attr) class ModelAlias(object): def __init__(self, model_class): self.__dict__['model_class'] = model_class def __getattr__(self, attr): model_attr = getattr(self.model_class, attr) if isinstance(model_attr, Field): return FieldProxy(self, model_attr) return model_attr def __setattr__(self, attr, value): raise AttributeError('Cannot set attributes on ModelAlias instances') def get_proxy_fields(self, declared_fields=False): mm = self.model_class._meta fields = mm.declared_fields if declared_fields else mm.sorted_fields return [FieldProxy(self, f) for f in fields] def select(self, *selection): if not selection: selection = self.get_proxy_fields() query = SelectQuery(self, *selection) if self._meta.order_by: query = query.order_by(*self._meta.order_by) return query def __call__(self, **kwargs): return self.model_class(**kwargs) if _SortedFieldList is None: class _SortedFieldList(object): __slots__ = ('_keys', '_items') def __init__(self): self._keys = [] self._items = [] def __getitem__(self, i): return self._items[i] def __iter__(self): return iter(self._items) def __contains__(self, item): k = item._sort_key i = bisect_left(self._keys, k) j = bisect_right(self._keys, k) return item in self._items[i:j] def index(self, field): return self._keys.index(field._sort_key) def insert(self, item): k = item._sort_key i = bisect_left(self._keys, k) self._keys.insert(i, k) self._items.insert(i, item) def remove(self, item): idx = self.index(item) del self._items[idx] del self._keys[idx] class DoesNotExist(Exception): pass if sqlite3: default_database = SqliteDatabase('peewee.db') else: default_database = None class ModelOptions(object): def __init__(self, cls, database=None, db_table=None, db_table_func=None, indexes=None, order_by=None, primary_key=None, table_alias=None, constraints=None, schema=None, validate_backrefs=True, only_save_dirty=False, **kwargs): self.model_class = cls self.name = cls.__name__.lower() self.fields = {} self.columns = {} self.defaults = {} self._default_by_name = {} self._default_dict = {} self._default_callables = {} self._default_callable_list = [] self._sorted_field_list = _SortedFieldList() self.sorted_fields = [] self.sorted_field_names = [] self.valid_fields = set() self.declared_fields = [] self.database = database if database is not None else default_database self.db_table = db_table self.db_table_func = db_table_func self.indexes = list(indexes or []) self.order_by = order_by self.primary_key = primary_key self.table_alias = table_alias self.constraints = 
constraints self.schema = schema self.validate_backrefs = validate_backrefs self.only_save_dirty = only_save_dirty self.auto_increment = None self.composite_key = False self.rel = {} self.reverse_rel = {} for key, value in kwargs.items(): setattr(self, key, value) self._additional_keys = set(kwargs.keys()) if self.db_table_func and not self.db_table: self.db_table = self.db_table_func(cls) def __repr__(self): return '<%s: %s>' % (self.__class__.__name__, self.name) def prepared(self): if self.order_by: norm_order_by = [] for item in self.order_by: if isinstance(item, Field): prefix = '-' if item._ordering == 'DESC' else '' item = prefix + item.name field = self.fields[item.lstrip('-')] if item.startswith('-'): norm_order_by.append(field.desc()) else: norm_order_by.append(field.asc()) self.order_by = norm_order_by def _update_field_lists(self): self.sorted_fields = list(self._sorted_field_list) self.sorted_field_names = [f.name for f in self.sorted_fields] self.valid_fields = (set(self.fields.keys()) | set(self.fields.values()) | set((self.primary_key,))) self.declared_fields = [field for field in self.sorted_fields if not isinstance(field, _AutoPrimaryKeyField)] def add_field(self, field): self.remove_field(field.name) self.fields[field.name] = field self.columns[field.db_column] = field self._sorted_field_list.insert(field) self._update_field_lists() if field.default is not None: self.defaults[field] = field.default if callable(field.default): self._default_callables[field] = field.default self._default_callable_list.append((field.name, field.default)) else: self._default_dict[field] = field.default self._default_by_name[field.name] = field.default def remove_field(self, field_name): if field_name not in self.fields: return original = self.fields.pop(field_name) del self.columns[original.db_column] self._sorted_field_list.remove(original) self._update_field_lists() if original.default is not None: del self.defaults[original] if self._default_callables.pop(original, None): for i, (name, _) in enumerate(self._default_callable_list): if name == field_name: self._default_callable_list.pop(i) break else: self._default_dict.pop(original, None) self._default_by_name.pop(original.name, None) def get_default_dict(self): dd = self._default_by_name.copy() for field_name, default in self._default_callable_list: dd[field_name] = default() return dd def get_field_index(self, field): try: return self._sorted_field_list.index(field) except ValueError: return -1 def get_primary_key_fields(self): if self.composite_key: return [ self.fields[field_name] for field_name in self.primary_key.field_names] return [self.primary_key] def rel_for_model(self, model, field_obj=None, multi=False): is_field = isinstance(field_obj, Field) is_node = not is_field and isinstance(field_obj, Node) if multi: accum = [] for field in self.sorted_fields: if isinstance(field, ForeignKeyField) and field.rel_model == model: is_match = ( (field_obj is None) or (is_field and field_obj.name == field.name) or (is_node and field_obj._alias == field.name)) if is_match: if not multi: return field accum.append(field) if multi: return accum def reverse_rel_for_model(self, model, field_obj=None, multi=False): return model._meta.rel_for_model(self.model_class, field_obj, multi) def rel_exists(self, model): return self.rel_for_model(model) or self.reverse_rel_for_model(model) def related_models(self, backrefs=False): models = [] stack = [self.model_class] while stack: model = stack.pop() if model in models: continue models.append(model) for fk 
class BaseModel(type):
    inheritable = set([
        'constraints', 'database', 'db_table_func', 'indexes', 'order_by',
        'primary_key', 'schema', 'validate_backrefs', 'only_save_dirty'])

    def __new__(cls, name, bases, attrs):
        if name == _METACLASS_ or bases[0].__name__ == _METACLASS_:
            return super(BaseModel, cls).__new__(cls, name, bases, attrs)

        meta_options = {}
        meta = attrs.pop('Meta', None)
        if meta:
            for k, v in meta.__dict__.items():
                if not k.startswith('_'):
                    meta_options[k] = v

        model_pk = getattr(meta, 'primary_key', None)
        parent_pk = None

        # inherit any field descriptors by deep copying the underlying field
        # into the attrs of the new model, additionally see if the bases
        # define inheritable model options and swipe them
        for b in bases:
            if not hasattr(b, '_meta'):
                continue

            base_meta = getattr(b, '_meta')
            if parent_pk is None:
                parent_pk = deepcopy(base_meta.primary_key)
            all_inheritable = cls.inheritable | base_meta._additional_keys
            for (k, v) in base_meta.__dict__.items():
                if k in all_inheritable and k not in meta_options:
                    meta_options[k] = v

            for (k, v) in b.__dict__.items():
                if k in attrs:
                    continue
                if isinstance(v, FieldDescriptor):
                    if not v.field.primary_key:
                        attrs[k] = deepcopy(v.field)

        # initialize the new class and set the magic attributes
        cls = super(BaseModel, cls).__new__(cls, name, bases, attrs)
        ModelOptionsBase = meta_options.get('model_options_base', ModelOptions)
        cls._meta = ModelOptionsBase(cls, **meta_options)
        cls._data = None
        cls._meta.indexes = list(cls._meta.indexes)

        if not cls._meta.db_table:
            cls._meta.db_table = re.sub(r'[^\w]+', '_', cls.__name__.lower())

        # replace fields with field descriptors, calling the add_to_class hook
        fields = []
        for name, attr in cls.__dict__.items():
            if isinstance(attr, Field):
                if attr.primary_key and model_pk:
                    raise ValueError('primary key is overdetermined.')
                elif attr.primary_key:
                    model_pk, pk_name = attr, name
                else:
                    fields.append((attr, name))

        composite_key = False
        if model_pk is None:
            if parent_pk:
                model_pk, pk_name = parent_pk, parent_pk.name
            else:
                model_pk, pk_name = PrimaryKeyField(primary_key=True), 'id'
        elif isinstance(model_pk, CompositeKey):
            pk_name = '_composite_key'
            composite_key = True

        if model_pk is not False:
            model_pk.add_to_class(cls, pk_name)
            cls._meta.primary_key = model_pk
            cls._meta.auto_increment = (
                isinstance(model_pk, PrimaryKeyField) or
                bool(model_pk.sequence))
            cls._meta.composite_key = composite_key

        for field, name in fields:
            field.add_to_class(cls, name)

        # create a repr and error class before finalizing
        if hasattr(cls, '__unicode__'):
            setattr(cls, '__repr__', lambda self: '<%s: %r>' % (
                cls.__name__, self.__unicode__()))

        exc_name = '%sDoesNotExist' % cls.__name__
        exc_attrs = {'__module__': cls.__module__}
        exception_class = type(exc_name, (DoesNotExist,), exc_attrs)
        cls.DoesNotExist = exception_class
        cls._meta.prepared()

        if hasattr(cls, 'validate_model'):
            cls.validate_model()

        DeferredRelation.resolve(cls)

        return cls

    def __iter__(self):
        return iter(self.select())
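
# Illustrative sketch, not part of the library: the BaseModel metaclass copies
# inheritable Meta options (database, schema, and so on) and non-primary-key
# field descriptors from base model classes into subclasses. `BaseNote`,
# `Note` and the in-memory database are hypothetical example names; nothing
# here runs at import time.
def _example_meta_inheritance():
    db = SqliteDatabase(':memory:')

    class BaseNote(Model):
        created = DateTimeField(default=datetime.datetime.now)

        class Meta:
            database = db

    class Note(BaseNote):
        content = TextField()

    # 'database' is in BaseModel.inheritable, so the subclass reuses the same
    # database object, and the `created` field was deep-copied onto Note.
    assert Note._meta.database is db
    assert 'created' in Note._meta.fields
    return Note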
class Model(with_metaclass(BaseModel)):
    def __init__(self, *args, **kwargs):
        self._data = self._meta.get_default_dict()
        self._dirty = set(self._data)
        self._obj_cache = {}

        for k, v in kwargs.items():
            setattr(self, k, v)

    @classmethod
    def alias(cls):
        return ModelAlias(cls)

    @classmethod
    def select(cls, *selection):
        query = SelectQuery(cls, *selection)
        if cls._meta.order_by:
            query = query.order_by(*cls._meta.order_by)
        return query

    @classmethod
    def update(cls, __data=None, **update):
        fdict = __data or {}
        fdict.update([(cls._meta.fields[f], update[f]) for f in update])
        return UpdateQuery(cls, fdict)

    @classmethod
    def insert(cls, __data=None, **insert):
        fdict = __data or {}
        fdict.update([(cls._meta.fields[f], insert[f]) for f in insert])
        return InsertQuery(cls, fdict)

    @classmethod
    def insert_many(cls, rows, validate_fields=True):
        return InsertQuery(cls, rows=rows, validate_fields=validate_fields)

    @classmethod
    def insert_from(cls, fields, query):
        return InsertQuery(cls, fields=fields, query=query)

    @classmethod
    def delete(cls):
        return DeleteQuery(cls)

    @classmethod
    def raw(cls, sql, *params):
        return RawQuery(cls, sql, *params)

    @classmethod
    def create(cls, **query):
        inst = cls(**query)
        inst.save(force_insert=True)
        inst._prepare_instance()
        return inst

    @classmethod
    def get(cls, *query, **kwargs):
        sq = cls.select().naive()
        if query:
            sq = sq.where(*query)
        if kwargs:
            sq = sq.filter(**kwargs)
        return sq.get()

    @classmethod
    def get_or_create(cls, **kwargs):
        defaults = kwargs.pop('defaults', {})
        query = cls.select()
        for field, value in kwargs.items():
            if '__' in field:
                query = query.filter(**{field: value})
            else:
                query = query.where(getattr(cls, field) == value)

        try:
            return query.get(), False
        except cls.DoesNotExist:
            try:
                params = dict((k, v) for k, v in kwargs.items()
                              if '__' not in k)
                params.update(defaults)
                with cls._meta.database.atomic():
                    return cls.create(**params), True
            except IntegrityError as exc:
                try:
                    return query.get(), False
                except cls.DoesNotExist:
                    raise exc

    @classmethod
    def create_or_get(cls, **kwargs):
        try:
            with cls._meta.database.atomic():
                return cls.create(**kwargs), True
        except IntegrityError:
            query = []  # TODO: multi-column unique constraints.
            for field_name, value in kwargs.items():
                field = getattr(cls, field_name)
                if field.unique or field.primary_key:
                    query.append(field == value)
            return cls.get(*query), False
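
    # Illustrative usage sketch (comments only; the `User` model is a
    # hypothetical example name, not part of this module):
    #
    #     user, created = User.get_or_create(
    #         username='charlie', defaults={'is_admin': False})
    #
    # get_or_create() first runs a SELECT built from the keyword filters; on
    # a miss it INSERTs inside database.atomic(), and if a concurrent writer
    # wins that race (IntegrityError) it falls back to re-running the SELECT.
    # create_or_get() inverts the order: try the INSERT first, and on an
    # IntegrityError look the row up by its unique or primary-key columns.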
    @classmethod
    def filter(cls, *dq, **query):
        return cls.select().filter(*dq, **query)

    @classmethod
    def table_exists(cls):
        kwargs = {}
        if cls._meta.schema:
            kwargs['schema'] = cls._meta.schema
        return cls._meta.db_table in cls._meta.database.get_tables(**kwargs)

    @classmethod
    def create_table(cls, fail_silently=False):
        if fail_silently and cls.table_exists():
            return

        db = cls._meta.database
        pk = cls._meta.primary_key
        if db.sequences and pk is not False and pk.sequence:
            if not db.sequence_exists(pk.sequence):
                db.create_sequence(pk.sequence)

        db.create_table(cls)
        cls._create_indexes()

    @classmethod
    def _fields_to_index(cls):
        fields = []
        for field in cls._meta.sorted_fields:
            if field.primary_key:
                continue
            requires_index = any((
                field.index,
                field.unique,
                isinstance(field, ForeignKeyField)))
            if requires_index:
                fields.append(field)
        return fields

    @classmethod
    def _index_data(cls):
        return itertools.chain(
            [((field,), field.unique) for field in cls._fields_to_index()],
            cls._meta.indexes or ())

    @classmethod
    def _create_indexes(cls):
        for field_list, is_unique in cls._index_data():
            cls._meta.database.create_index(cls, field_list, is_unique)

    @classmethod
    def _drop_indexes(cls, safe=False):
        for field_list, is_unique in cls._index_data():
            cls._meta.database.drop_index(cls, field_list, safe)

    @classmethod
    def sqlall(cls):
        queries = []
        compiler = cls._meta.database.compiler()
        pk = cls._meta.primary_key
        if cls._meta.database.sequences and pk.sequence:
            queries.append(compiler.create_sequence(pk.sequence))
        queries.append(compiler.create_table(cls))
        for field in cls._fields_to_index():
            queries.append(compiler.create_index(cls, [field], field.unique))
        if cls._meta.indexes:
            for field_names, unique in cls._meta.indexes:
                fields = [cls._meta.fields[f] for f in field_names]
                queries.append(compiler.create_index(cls, fields, unique))
        return [sql for sql, _ in queries]

    @classmethod
    def drop_table(cls, fail_silently=False, cascade=False):
        cls._meta.database.drop_table(cls, fail_silently, cascade)

    @classmethod
    def truncate_table(cls, restart_identity=False, cascade=False):
        cls._meta.database.truncate_table(cls, restart_identity, cascade)

    @classmethod
    def as_entity(cls):
        if cls._meta.schema:
            return Entity(cls._meta.schema, cls._meta.db_table)
        return Entity(cls._meta.db_table)

    @classmethod
    def noop(cls, *args, **kwargs):
        return NoopSelectQuery(cls, *args, **kwargs)

    def _get_pk_value(self):
        return getattr(self, self._meta.primary_key.name)
    get_id = _get_pk_value  # Backwards-compatibility.

    def _set_pk_value(self, value):
        if not self._meta.composite_key:
            setattr(self, self._meta.primary_key.name, value)
    set_id = _set_pk_value  # Backwards-compatibility.

    def _pk_expr(self):
        return self._meta.primary_key == self._get_pk_value()

    def _prepare_instance(self):
        self._dirty.clear()
        self.prepared()

    def prepared(self):
        pass

    def _prune_fields(self, field_dict, only):
        new_data = {}
        for field in only:
            if field.name in field_dict:
                new_data[field.name] = field_dict[field.name]
        return new_data

    def _populate_unsaved_relations(self, field_dict):
        for key in self._meta.rel:
            conditions = (
                key in self._dirty and
                key in field_dict and
                field_dict[key] is None and
                self._obj_cache.get(key) is not None)
            if conditions:
                setattr(self, key, getattr(self, key))
                field_dict[key] = self._data[key]

    def save(self, force_insert=False, only=None):
        field_dict = dict(self._data)
        if self._meta.primary_key is not False:
            pk_field = self._meta.primary_key
            pk_value = self._get_pk_value()
        else:
            pk_field = pk_value = None
        if only:
            field_dict = self._prune_fields(field_dict, only)
        elif self._meta.only_save_dirty and not force_insert:
            field_dict = self._prune_fields(
                field_dict,
                self.dirty_fields)
            if not field_dict:
                self._dirty.clear()
                return False

        self._populate_unsaved_relations(field_dict)
        if pk_value is not None and not force_insert:
            if self._meta.composite_key:
                for pk_part_name in pk_field.field_names:
                    field_dict.pop(pk_part_name, None)
            else:
                field_dict.pop(pk_field.name, None)
            rows = self.update(**field_dict).where(self._pk_expr()).execute()
        elif pk_field is None:
            self.insert(**field_dict).execute()
            rows = 1
        else:
            pk_from_cursor = self.insert(**field_dict).execute()
            if pk_from_cursor is not None:
                pk_value = pk_from_cursor
            self._set_pk_value(pk_value)
            rows = 1
        self._dirty.clear()
        return rows

    def is_dirty(self):
        return bool(self._dirty)

    @property
    def dirty_fields(self):
        return [f for f in self._meta.sorted_fields if f.name in self._dirty]

    def dependencies(self, search_nullable=False):
        model_class = type(self)
        query = self.select().where(self._pk_expr())
        stack = [(type(self), query)]
        seen = set()
        while stack:
            klass, query = stack.pop()
            if klass in seen:
                continue
            seen.add(klass)
            for rel_name, fk in klass._meta.reverse_rel.items():
                rel_model = fk.model_class
                if fk.rel_model is model_class:
                    node = (fk == self._data[fk.to_field.name])
                    subquery = rel_model.select().where(node)
                else:
                    node = fk << query
                    subquery = rel_model.select().where(node)
                if not fk.null or search_nullable:
                    stack.append((rel_model, subquery))
                yield (node, fk)

    def delete_instance(self, recursive=False, delete_nullable=False):
        if recursive:
            dependencies = self.dependencies(delete_nullable)
            for query, fk in reversed(list(dependencies)):
                model = fk.model_class
                if fk.null and not delete_nullable:
                    model.update(**{fk.name: None}).where(query).execute()
                else:
                    model.delete().where(query).execute()
        return self.delete().where(self._pk_expr()).execute()

    def __hash__(self):
        return hash((self.__class__, self._get_pk_value()))

    def __eq__(self, other):
        return (
            other.__class__ == self.__class__ and
            self._get_pk_value() is not None and
            other._get_pk_value() == self._get_pk_value())

    def __ne__(self, other):
        return not self == other
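
# Illustrative sketch, not part of the library: with Meta.only_save_dirty
# enabled, save() only issues an UPDATE for fields whose descriptors were
# assigned since the last save, and delete_instance(recursive=True) removes
# dependent rows via reverse relations before deleting the row itself. The
# `Blog`/`Entry` models and the in-memory database below are hypothetical
# example names; nothing here runs at import time.
def _example_save_and_delete():
    db = SqliteDatabase(':memory:')

    class Blog(Model):
        title = CharField()

        class Meta:
            database = db
            only_save_dirty = True

    class Entry(Model):
        blog = ForeignKeyField(Blog, related_name='entries')
        body = TextField(default='')

        class Meta:
            database = db
            only_save_dirty = True

    db.create_tables([Blog, Entry])
    blog = Blog.create(title='peewee')
    Entry.create(blog=blog, body='first post')

    blog.title = 'peewee 2.x'
    assert blog.is_dirty() and blog.dirty_fields == [Blog.title]
    blog.save()  # UPDATE touches only the dirty `title` column.

    # Deletes dependent Entry rows first, then the Blog row.
    blog.delete_instance(recursive=True)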
def clean_prefetch_subquery(query):
    query = query.clone()
    query._group_by = query._having = None
    return query


def prefetch_add_subquery(sq, subqueries):
    fixed_queries = [PrefetchResult(sq)]
    for i, subquery in enumerate(subqueries):
        if isinstance(subquery, tuple):
            subquery, target_model = subquery
        else:
            target_model = None
        if not isinstance(subquery, Query) and issubclass(subquery, Model):
            subquery = subquery.select()
        subquery_model = subquery.model_class
        fks = backrefs = None
        for j in reversed(range(i + 1)):
            prefetch_result = fixed_queries[j]
            last_query = prefetch_result.query
            last_model = prefetch_result.model
            rels = subquery_model._meta.rel_for_model(last_model, multi=True)
            if rels:
                fks = [getattr(subquery_model, fk.name) for fk in rels]
                pks = [getattr(last_model, fk.to_field.name) for fk in rels]
            else:
                backrefs = last_model._meta.rel_for_model(
                    subquery_model, multi=True)
            if (fks or backrefs) and ((target_model is last_model) or
                                      (target_model is None)):
                break

        if not (fks or backrefs):
            tgt_err = ' using %s' % target_model if target_model else ''
            raise AttributeError('Error: unable to find foreign key for '
                                 'query: %s%s' % (subquery, tgt_err))

        if fks:
            cleaned = clean_prefetch_subquery(last_query)
            expr = reduce(operator.or_, [
                (fk << cleaned.select(pk))
                for (fk, pk) in zip(fks, pks)])
            subquery = subquery.where(expr)
            fixed_queries.append(PrefetchResult(subquery, fks, False))
        elif backrefs:
            cleaned = clean_prefetch_subquery(last_query)
            expr = reduce(operator.or_, [
                (backref.to_field << cleaned.select(backref))
                for backref in backrefs])
            subquery = subquery.where(expr)
            fixed_queries.append(PrefetchResult(subquery, backrefs, True))

    return fixed_queries


__prefetched = namedtuple('__prefetched', (
    'query', 'fields', 'backref', 'rel_models', 'field_to_name', 'model'))


class PrefetchResult(__prefetched):
    def __new__(cls, query, fields=None, backref=None, rel_models=None,
                field_to_name=None, model=None):
        if fields:
            if backref:
                rel_models = [field.model_class for field in fields]
                foreign_key_attrs = [field.to_field.name for field in fields]
            else:
                rel_models = [field.rel_model for field in fields]
                foreign_key_attrs = [field.name for field in fields]
            field_to_name = list(zip(fields, foreign_key_attrs))
        model = query.model_class
        return super(PrefetchResult, cls).__new__(
            cls, query, fields, backref, rel_models, field_to_name, model)

    def populate_instance(self, instance, id_map):
        if self.backref:
            for field in self.fields:
                identifier = instance._data[field.name]
                key = (field, identifier)
                if key in id_map:
                    setattr(instance, field.name, id_map[key])
        else:
            for field, attname in self.field_to_name:
                identifier = instance._data[field.to_field.name]
                key = (field, identifier)
                rel_instances = id_map.get(key, [])
                dest = '%s_prefetch' % field.related_name
                for inst in rel_instances:
                    setattr(inst, attname, instance)
                setattr(instance, dest, rel_instances)
    def store_instance(self, instance, id_map):
        for field, attname in self.field_to_name:
            identity = field.to_field.python_value(instance._data[attname])
            key = (field, identity)
            if self.backref:
                id_map[key] = instance
            else:
                id_map.setdefault(key, [])
                id_map[key].append(instance)


def prefetch(sq, *subqueries):
    if not subqueries:
        return sq
    fixed_queries = prefetch_add_subquery(sq, subqueries)

    deps = {}
    rel_map = {}
    for prefetch_result in reversed(fixed_queries):
        query_model = prefetch_result.model
        if prefetch_result.fields:
            for rel_model in prefetch_result.rel_models:
                rel_map.setdefault(rel_model, [])
                rel_map[rel_model].append(prefetch_result)

        deps[query_model] = {}
        id_map = deps[query_model]
        has_relations = bool(rel_map.get(query_model))

        for instance in prefetch_result.query:
            if prefetch_result.fields:
                prefetch_result.store_instance(instance, id_map)
            if has_relations:
                for rel in rel_map[query_model]:
                    rel.populate_instance(instance, deps[rel.model])

    return prefetch_result.query


def create_model_tables(models, **create_table_kwargs):
    """Create tables for all given models (in the right order)."""
    for m in sort_models_topologically(models):
        m.create_table(**create_table_kwargs)


def drop_model_tables(models, **drop_table_kwargs):
    """Drop tables for all given models (in the right order)."""
    for m in reversed(sort_models_topologically(models)):
        m.drop_table(**drop_table_kwargs)
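
# Illustrative sketch, not part of the library: prefetch() runs one query per
# model and stitches the related rows together in Python, so iterating a
# backref does not trigger one query per parent row. The related rows land on
# `<related_name>_prefetch`. The `Author`/`Book` models and the in-memory
# database are hypothetical example names; nothing here runs at import time.
def _example_prefetch():
    db = SqliteDatabase(':memory:')

    class Author(Model):
        name = CharField()

        class Meta:
            database = db

    class Book(Model):
        author = ForeignKeyField(Author, related_name='books')
        title = CharField()

        class Meta:
            database = db

    create_model_tables([Author, Book])
    herbert = Author.create(name='Herbert')
    Book.create(author=herbert, title='Dune')

    # Two queries total: one for the authors, one for all of their books.
    authors = prefetch(Author.select(), Book.select())
    results = [(a.name, [b.title for b in a.books_prefetch])
               for a in authors]

    drop_model_tables([Author, Book])
    return results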