Alignments – Two Ways

Since it is important to double-check, I also used the raw PDF data to verify my findings.

Columns a through f are the cm matrix values for each XObject; since b and c are zero, a is the width, d is the height, and (e, f) is the lower-left corner of the placed image. Column g is d + f (the top edge), and h is 612 - g, i.e. 612 - d - f, the distance measured down from the top of the 612-point-high page (see the sketch after the table). The first object is the background. Compare columns e and f with the first two columns of values (top x and top y) in the second table below, which were obtained by measurement. I missed one box in the measured series; I am sure Hermitian can find it and do his calculations to show that it, too, matches up.

      a   b   c        d        e        f          g        h
 798.72   0   0    614.4    -3.36     -1.2      613.2     -1.2
 336.72   0   0   419.76   236.64    95.52     515.28    96.72
  47.04   0   0   166.56    71.52   127.92     294.48   317.52
  10.32   0   0    63.36     88.8    373.2     436.56   175.44
 124.08   0   0   113.76   369.12   123.12     236.88   375.12
   28.8   0   0    51.36   248.16      126     177.36   434.64
  12.24   0   0    49.68   246.24   452.16     501.84   110.16
  31.92   0   0    83.76    280.8   444.96     528.72    83.28
  66.96   0   0    42.96   246.24   138.24      181.2    430.8
   7.68   0   0    21.84   315.36   243.84     265.68   346.32
  31.44   0   0    88.56      444   344.16     432.72   179.28
  25.92   0   0    54.72   449.76    445.2     499.92   112.08
   22.8   0   0     31.2   739.68   257.52     288.72   323.28
  24.96   0   0    17.04   737.76    110.4     127.44   484.56
  41.28   0   0     8.16   687.84   113.52     121.68   490.32
  26.16   0   0     50.4   488.16    361.2      411.6    200.4
  18.96   0   0       12   735.84   301.68     313.68   298.32
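
As a quick cross-check, here is a minimal Python sketch of the g and h derivation, using only the first three rows of the table above as sample data:

# Check the derived columns: with b = c = 0, the cm matrix [a 0 0 d e f]
# places the image's lower-left corner at (e, f) with height d, so the top
# edge is g = d + f and h = 612 - g measures down from the top of the page.

PAGE_HEIGHT = 612  # landscape US Letter is 792 x 612 points

rows = [  # (a, d, e, f) for the first three XObjects in the table above
    (798.72, 614.40,  -3.36,  -1.20),
    (336.72, 419.76, 236.64,  95.52),
    ( 47.04, 166.56,  71.52, 127.92),
]

for a, d, e, f in rows:
    g = d + f
    h = PAGE_HEIGHT - g
    print(f"e={e:8.2f}  f={f:8.2f}  g={g:8.2f}  h={h:8.2f}")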

x_px = 300 * top_x / 72
y_px = 300 * top_y / 72
ofx  = x_px + 14
ofy  = y_px + 5
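
These formulas translate the PDF point coordinates (72 per inch) into pixel coordinates of the 300 dpi scan plus a small fixed offset; the ofx/8 and ofy/8 columns in the table below are simply those offsets divided by 8. A minimal Python sketch of the same arithmetic (the two sample rows are taken from the table below):

# Convert the PDF origin of each XObject (points, 72/inch) into 300 dpi pixel
# coordinates, apply the +14 / +5 offsets, and reduce to the /8 grid values.

def to_pixels(top_x, top_y, dpi=300, off_x=14, off_y=5, cell=8):
    x_px = round(dpi * top_x / 72)
    y_px = round(dpi * top_y / 72)
    ofx, ofy = x_px + off_x, y_px + off_y
    return x_px, y_px, ofx, ofy, ofx // cell, ofy // cell

# e.g. the 'signature' and 'Date Apr 25' rows of the table below
for label, tx, ty in [("signature", 71.52, 317.52), ("Date Apr 25", 88.8, 175.44)]:
    print(label, to_pixels(tx, ty))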

Region                    top x  top y  x_px y_px  ofx  ofy ofx/8 ofy/8
Mostly text              236.64  96.72   986  403  400  408   50    51
signature                 71.52 317.52   298 1323  312 1328   39   166
Date Apr 25               88.8  175.44   370  731  384  736   48    92
Block right Oahu/African 369.12 375.12  1538 1563 1552 1568  194   196
Aug 8 1961 Right         248.16 434.64  1034 1811 1048 1817  131   227
Aug 8 1961 left          246.24 110.16  1026  459 1040  464  130    58 
Above left date          280.8   83.28  1170  347 1184  352  148    44
Lower right blocks       246.24 430.8   1026 1795 1040 1800  130   225
None                     315.36 346.32  1314 1443 1328 1448  166   181
Maternity                444    179.28  1850  747 1864  752  233    94
Kapiolani                449.76 112.08  1874  467 1888  472  236    59
Right Top                737.76 484.56  3074 2019 3088 2024  386   253
Below right top          687.84 490.32  2866 2043 2880 2048  360   256
Triplet                  488.16 200.4   2034  835 2048  840  256   105
Below Center Top         735.84 298.32  3066 1243 3080 1248  385   156

Educating the Confused – Deflate

For some unknown reason, our almost-resident expert Hermitian has suggested that FlateDecode is a lossy compression.

Wikipedia may be helpful in correcting that impression; Flate (the deflate algorithm used by ZLIB) is lossless by design. Of course, anyone familiar with ZLIB would have known and understood this.

Examples of lossy (or optionally lossy) compression filters in PDFs include:

  • DCTDecode – a.k.a. JPEG; always lossy, even at the highest quality settings
  • JPXDecode – wavelet-based JPEG 2000; lossless or lossy
  • JBIG2Decode – for bitonal images; lossless or lossy
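
For anyone who would rather see it than take Wikipedia's word for it, a Flate round trip is trivial to verify. A minimal Python sketch (the sample data is arbitrary):

import zlib

data = b"Any byte stream at all: text, a bitmap, even another PDF stream." * 100

compressed = zlib.compress(data, 9)   # FlateDecode streams are zlib/deflate data
restored   = zlib.decompress(compressed)

assert restored == data               # byte-for-byte identical: Flate is lossless
print(len(data), "->", len(compressed), "bytes; round trip exact:", restored == data)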

And let me also explain why I believe most programs do not touch DCTDecoded data. It has been argued and observed that Preview preserves the exact DCTDecode stream, and for obvious reasons: if it re-encoded the decoded bitmap, it would either have to compress at very high quality, in which case the file size would explode, or recompress the bitmap with JPEG and add yet another generation of loss. I believe that Adobe tools are more destructive here.

The way a PDF editor typically works is that it maintains two different ‘trees’: one contains the PDF tree with all the raw objects, the other contains the rendered information. When objects are not touched, their raw data is written back, which avoids the JPEG problems described above. This is also why Preview maintains the landscape orientation of the images.
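
To make the idea concrete, here is a purely illustrative Python sketch of that ‘two tree’ design; the names are hypothetical and this is not the code of Preview or any other editor:

class PdfObjectNode:
    # One node of the raw-object tree; the rendered form lives alongside it.
    def __init__(self, obj_num, raw_bytes):
        self.obj_num = obj_num
        self.raw_bytes = raw_bytes   # exact bytes parsed from the source file
        self.rendered = None         # decoded form used only for display/editing
        self.edited = False

    def bytes_to_write(self, reencode):
        # Untouched objects are written back verbatim, so an unedited
        # DCTDecode (JPEG) stream never suffers another lossy generation.
        return reencode(self.rendered) if self.edited else self.raw_bytes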

I can appreciate that, to someone recently introduced to PDF-encoded data, it may all appear somewhat overwhelming, and that seeing /FlateDecode /DCTDecode listed as the filter, one may at first be confused by the order. However, a quick logical analysis of the two possibilities quickly eliminates the flow in which the bitmap was first zipped up and the zip data was then somehow encoded in a lossy fashion. Imagine the surprise when trying to inflate such data, only to find that it is no longer a valid stream because the lossy DCT step has mangled it.

Alternatively, one could also have read the PDF standard documentation, which outlines the order in which the filters are applied.
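
Per the standard, the filters listed in /Filter are applied in that order when decoding, so /Filter [/FlateDecode /DCTDecode] means: inflate first, then JPEG-decode, which in turn means the bitmap was JPEG-compressed before it was deflated. A minimal Python sketch (decode_stream is a hypothetical helper; Pillow is assumed for the JPEG step):

import io
import zlib
from PIL import Image  # Pillow, assumed available for the DCT (JPEG) step

def decode_stream(raw, filters):
    # Apply the PDF stream filters in the order they are listed for decoding.
    data = raw
    for name in filters:              # e.g. ["FlateDecode", "DCTDecode"]
        if name == "FlateDecode":
            data = zlib.decompress(data)         # step 1: inflate the zlib wrapper
        elif name == "DCTDecode":
            data = Image.open(io.BytesIO(data))  # step 2: decode the JPEG inside
        else:
            raise NotImplementedError(name)
    return data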

Educating the Confused – Presentments

At OBC, the following issue is raised which, I believe, exposes another example of poor reading and/or comprehension skills. The issue is that presentments are constitutionally permitted but not required, and thus the lack of any rules implementing them is not at variance with our Constitution. It’s very simple really. It’s the same with the Secretary of State having the authority to determine eligibility but not a requirement to do so. Perhaps the subtleties of the English language may be lost on some, but I see no real contradiction here. The Federal rules do not provide any authority for the Courts to accept presentments; the route is through an indictment. Furthermore, the existence of a ‘citizens grand jury’ has limited foundation (understatement) in our laws. So, even if one could establish that Courts are required to accept presentments, the ‘grand jury’ which supposedly made these ‘presentments’ failed to abide by the rules that guide a Grand Jury. Which makes sense, for otherwise any group of disgruntled people could establish a ‘grand jury’…

Punking the Puz’

I have been having a good time with Mario Apuzzo, whose musings were referred to by judges as ‘lacking in merit’ and ‘academic only’.

It all started when I outlined how the appealing attorney had positioned the cause in error as “Did the lower court err in finding that Wong Kim Ark was a natural-born citizen?” Mario responded, claiming that I was making things up and that the term ‘natural born’ was not even used in the lower Court ruling in Wong Kim Ark… Little did he know that I had transcribed the ruling in 2009 and posted it on my blog.
