Not everyone knows what I, along with others, mean by "dot notation." This way of notating states and behaviors grew up around "objects" in computer languages, many of which were revolutionized in the 1970s and 1980s following the appearance of Smalltalk. The object-oriented way of thinking about programming was a revelation.
I was on the xBase track, i.e. the sequence of languages beginning with dBase II (which had precursors at Caltech's Jet Propulsion Laboratory) and going in various directions through the Visual FoxPros, of which version 9 is the latest. Our switch to OO in xBase occurred around the time Microsoft bought the product and rewrote it to support an OO GUI (this product has always been something of a skeleton in the closet: it's more powerful than Access or VB, but its marketing budget always gets killed).
Anyway, dot notation looks like this:
>>> mydog = Dog("Fido")
>>> mydog.name
"Fido"
>>> mydog.bark(2)
"bark, bark"
>>> mydog.bark(3)
"bark, bark, bark"
A human user is entering strings after the >>> marks (the prompt); the interpreter takes over upon a press of the Enter key and comes back with some response (or not) on the line below. The above looks like Python, but many languages do it similarly, including this use of the dot (period) as a separator between the object (mydog in this example) and its states and behaviors.
Objects come into existence through their class constructors, where the class is the blueprint or generic case of an object. At birth, an object gets its own slice of computer memory and its own private variables, there to individualize it. A garbage collector looks for abandoned objects and frees their resources.
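For concreteness, here's a minimal sketch of a Dog class that would support the session above; the details (the exact bark string, the argument names) are my guesses, not taken from any particular library:

class Dog:
    """Blueprint for dog objects: one piece of state, one behavior."""

    def __init__(self, name):       # the constructor, run at "birth"
        self.name = name            # this object's own bit of state

    def bark(self, times=1):        # a behavior, triggered via dot notation
        return ", ".join(["bark"] * times)

mydog = Dog("Fido")
print(mydog.name)       # Fido
print(mydog.bark(2))    # bark, bark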
During an object's life span, dot notation gives programmers a way to trigger behaviors and to consult or set states. Typically, the programmer will wire behaviors to events, such as a mouse click on a certain button. Objects listen or subscribe to a kind of Events News (part of a loop or process on the CPU) and react to a subset of those events, according to their class definitions. That's the OO model in a nutshell (it also sounds a lot like the economy), except I haven't really described how we develop/program within an extensible type system, taking many blueprints for granted, and creating new class definitions by means of inheritance, interfaces and composition, per the Java/C# example.
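To make the "Events News" picture concrete, here's a minimal sketch of the subscribe-and-react pattern in Python; EventHub and Button are my own stand-in names, not any particular GUI toolkit's API:

class EventHub:
    """A toy dispatcher: keeps subscribers, notifies them when events fire."""

    def __init__(self):
        self.subscribers = {}                 # event name -> list of handlers

    def subscribe(self, event, handler):
        self.subscribers.setdefault(event, []).append(handler)

    def fire(self, event):
        for handler in self.subscribers.get(event, []):
            handler()                         # trigger each wired-up behavior

class Button:
    """Subscribes its own click handler at construction time."""

    def __init__(self, label, hub):
        self.label = label
        hub.subscribe("click:" + label, self.on_click)

    def on_click(self):
        print(self.label, "was clicked")

hub = EventHub()
ok_button = Button("OK", hub)
hub.fire("click:OK")                          # OK was clicked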
My recent proposal to EuroPython is that we develop a curriculum that phases dot notation into K-12, because these design concepts have matured to the point where we can count on kids having recourse to them on the job, at the workplace. OO is well-nigh universal as a paradigm (which does not mean other paradigms don't exist or are not valued -- it's just that we understand the OO style, even if we choose to use something else from time to time). I've proposed that we do more of this in math class, by building and then using rational numbers, integers modulo N, vectors, matrices, polyhedra as objects (defined by extension as new classes, types -- and optionally in Python). This math/CS hybrid would run parallel to the traditional pre-calc/calc track, criss-crossing it in various ways, per trig, stats, measurement and the like.
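As one illustration of the "math objects" idea, here's a minimal sketch of integers modulo N as a Python class; the class name and details are my own, not part of any published curriculum:

class ModN:
    """An integer modulo some fixed N, supporting + and *."""

    def __init__(self, value, modulus):
        self.modulus = modulus
        self.value = value % modulus          # reduce on construction

    def __add__(self, other):                 # dot notation behind the + sign
        return ModN(self.value + other.value, self.modulus)

    def __mul__(self, other):                 # and behind the * sign
        return ModN(self.value * other.value, self.modulus)

    def __repr__(self):
        return f"{self.value} (mod {self.modulus})"

a = ModN(5, 7)
b = ModN(4, 7)
print(a + b)   # 2 (mod 7)
print(a * b)   # 6 (mod 7)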