Wednesday, July 22, 2015

Snapshots from OSCON


Our opening session, with its tone-setting keynotes, began with a representative of the UK government explaining her organization's commitment to serving the UK. Digital Services is leveraging open source by insisting on open standards; the two drive each other. How best to share street address information, geographic location and so on?

Now I'm in a talk on what every programmer should know about floating-point arithmetic, by Java Floating-Point Czar Emeritus Joseph D. Darcy, currently with Oracle.  From here, I'll be heading to the HP-sponsored lunch.

Before this talk, I enjoyed a first visit to the Expo Hall, heading straight for the O'Reilly booth on Debra's instructions to make sure our school's new catalog / brochure was displayed.  Yes it was, with Natalia on the cover.  Bravo.

The HP booth speaker wanted us all to know about HP's huge commitment to "keeping it open" (we got a free mug for listening).  The Facebook keynote was along the same lines, as was Allison Randal's talk:  it's not just out of altruism or some bleary-eyed thinking that companies embrace open source; they do so out of economic necessity.

Allison would like to see reluctant joiners becoming more effective users.  Holding on for dear life is less enjoyable and rewarding than contributing as a full participant.  Facebook:  the discipline required to make projects suitable for public consumption is likewise what makes them robust enough for in-house re-use.

At one point in the early days it looked like F/OSS might always be the hobbyist version playing catch up to the grown up stuff.  Whereas many proprietary solutions are best of breed, in some domains the free tools are also the only tools or simply the best tools available.

The Linux Foundation guy was super excited about containers, the next big thing in data center development.  Again, open standards is the name of the game, as the skeleton key unlocking our perennial dreams of total interoperability.

They call floating point numbers an "approximation" of the reals (ℝ), but since when did anyone multiply π times itself in pure real numbers?  The reals have always seemed pretty unreal to me.  To what precision do the real number people multiply π?  Real reals have no upper limit on precision right?

ℕ ⊆ ℤ ⊆ ℚ.  Of those, ℚ is a field.  Then came the leap to the Algebraic Numbers as a subset of ℝ.  "They threw the guy who discovered ℚ was insufficient overboard" (paraphrase).

The Lindemann–Weierstrass theorem (1882) proved π is transcendental, not algebraic, so the Real Numbers include both.  Another field.  ℝ ⊆ ℂ ⊆ Quaternions (⊆ Octonions).  Surreal Numbers, invented by Conway ("the other Conway" some say, given OSCON began as the Perl Conference and ours is first-named Damian).  Donald Knuth wrote a novel entitled Surreal Numbers.

Floating point numbers need to be deterministic, reproducible and so on; i.e., the rules need to be clear.  The significand is multiplied by 2 to some exponent.  All floating-point numbers are therefore rational.  CPU specifications often defer to IEEE 754.
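A quick Python sketch of both claims, using only the standard library: `math.frexp` exposes the significand/exponent pair, and `as_integer_ratio` gives the exact fraction every float secretly is.

```python
import math

# A float is significand * 2**exponent; math.frexp exposes that pair
# (with the significand normalized into [0.5, 1)).
m, e = math.frexp(0.1)
print(m, e)            # 0.8 -3  (0.8 * 2**-3 == 0.1, as stored)

# Every float is therefore rational: as_integer_ratio returns the
# exact numerator/denominator of the value actually stored for 0.1.
num, den = (0.1).as_integer_ratio()
print(num, den)        # 3602879701896397 36028797018963968
```

Note the denominator is a power of two, which is exactly why 0.1 can't be stored exactly: 1/10 has no finite binary expansion.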

Most of the talk was on the non-field properties of floating point numbers.  They're not associative, for example, as in the 3-bit significand "toy floating point" system introduced in the slides:  2.0 + (0.1 + 0.1) != (2.0 + 0.1) + 0.1.  Best not to use floats for money, given a true Decimal type is more likely to obey the established rules for rounding that predate electronic computing.  Java and Python both offer extended-precision Decimals, as do many other computer languages.
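The toy system in the slides has a 3-bit significand, but full 64-bit doubles break associativity just as readily; a minimal demonstration in Python, alongside the `decimal` fix for money-style arithmetic:

```python
from decimal import Decimal

# Binary floats are not associative: the grouping changes which
# intermediate results get rounded.
a = (0.1 + 0.2) + 0.3
b = 0.1 + (0.2 + 0.3)
print(a == b)          # False
print(a, b)            # 0.6000000000000001 0.6

# Decimal carries exact decimal digits, so sums of money behave
# the way pre-computer rounding rules expect.
c = Decimal("0.10") + Decimal("0.20")
print(c == Decimal("0.30"))   # True
```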

After lunch:  the future of mobile payments, by Jonathan LeBlanc from PayPal.  The payment industry is shifting to serving the mobile environment in a big way.  Location and habit awareness, browser uniqueness, and device fingerprinting all help with user authentication.  When a user deviates from patterns and falls outside the trust zone as a result, additional challenges may be issued as extra checks.

The key term this year seems to be "at scale" which means "not diminished" as in "using the full data set".  For example, graph analysis "at scale" implies doing something computationally intensive. Kenny Bastani showed us how to use Docker to get Neo4j talking to Apache Spark to run PageRank and Centrality algorithms against toy amounts of graph data -- but he assured us the same techniques would work against all of Wikipedia (i.e. "at scale").
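For a sense of what the Spark cluster was actually computing, here's a toy PageRank by power iteration over a hypothetical four-node adjacency list (node names are mine, not from the demo) -- the same algorithm, minus the "at scale" part:

```python
# Toy PageRank by power iteration: each round, every node keeps a
# (1 - damping) baseline and receives a damped share of the rank of
# each node linking to it.
def pagerank(links, damping=0.85, iters=50):
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for n, outs in links.items():
            if outs:
                share = damping * rank[n] / len(outs)
                for m in outs:
                    new[m] += share
            else:
                # Dangling node: distribute its rank evenly everywhere.
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))   # c -- it collects the most inbound rank
```

Spark distributes exactly this iterate-and-redistribute loop across partitions of the edge list, which is what makes the Wikipedia-sized version feasible.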

Item lost:  neoprene case for the Mac Air.  Let's hope that's the extent of my losing stuff this year.  I dashed downtown to grab a replacement at the Apple Store, then grabbed a couple pints at the Yard House adjacent, before reboarding the MAX to return to the Expo Hall.  I ended up talking to a satellite guy with UCAR in the process of developing his Python chops, coming from Perl.

Steve, Cynthia, Patrick and I took the MAX back to my car and ended up on my back deck, talking over the events of the day.  Patrick gives a talk tomorrow.