Mathematically rigorous .NET programming
- To: mathgroup at smc.vnet.net
- Subject: [mg45148] Mathematically rigorous .NET programming
- From: "Steve S. Shaw" <steve at shawstudio.com>
- Date: Thu, 18 Dec 2003 06:55:22 -0500 (EST)
- Sender: owner-wri-mathgroup at wolfram.com
(Offshoot from [mg45085] NETLink - CREATING a new class?)

On further reflection, a "Mathematica -> .NET" compiler is not quite what I want. While I use Mathematica daily, what I primarily do is .NET programming, in an O-O style. I would love to infuse O-O .NET programming with some of Mathematica's nature, plus some things that aren't often done, but that become practical given a combination of reflective O-O and symbolic pattern matching.

First, truly INTERACTIVE development: poking at live data, rather than making a change and then starting the program running from scratch, over and over. (Visual Studio .NET's debugger CAN make a change in a running program and continue, which is part of what is needed, but it is still a long way from the "feel" of interactive development.)

Second, static and dynamic ANALYSIS of algorithms and data, including EXPLORING THE HISTORY of how some data reached a certain state: where [in the logic] did each of these objects originate? What path did each object take through the code? What other objects influenced the current state of a given object?

- - - - - - - - - -

Third, a mathematically rigorous representation of algorithms, building on O-O features found in Eiffel and Java (and available in C# now, or in 2004): streams, filters, adaptors; abstract types ("interfaces" in Java); preconditions & postconditions; generic types.
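As a minimal sketch of the "abstract types + preconditions & postconditions + generics" combination above, here is what it might look like in Java (all names here are my own invention for illustration, not anything from NETLink or .NET):

```java
// Hypothetical sketch: a generic "filter" abstract type (an interface in
// Java terms) whose implementation states its contract explicitly.
interface Filter<T> {
    T apply(T input);
}

class SqrtFilter implements Filter<Double> {
    public Double apply(Double x) {
        // Precondition: explicitly reject null and negative inputs,
        // instead of silently producing NaN.
        if (x == null || x < 0) {
            throw new IllegalArgumentException("sqrt requires x >= 0, got " + x);
        }
        double result = Math.sqrt(x);
        // Postcondition: squaring the result should recover x (within rounding).
        assert Math.abs(result * result - x) < 1e-9 : "postcondition failed";
        return result;
    }
}
```

Eiffel builds such contracts into the language; in Java or C# they must be simulated with checks and assertions, which is part of why a more rigorous environment is attractive.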
Adding some features from proof logic and electronic circuit design:

- hierarchical state machines;

- a start-up phase vs. a normal operating state (in O-O terms, some methods should only be called while a net of objects is first being created or modified; others should only be called while the net is in its fully operational state);

- multi-types / constrained types / coupled types (precise descriptions of permissible data structures/patterns/conditions);

- coverage of all cases (an extension of preconditions into the innards of an algorithm: proof that at any branch, all cases are EXPLICITLY handled. It is not permissible to say "if x > 0 then ..." without also saying what happens "if x <= 0". In modern O-O, there is a very common failure to clarify the handling of "null" values);

- master clocks and buffered inputs / dependencies / propagation of change (a functional alternative to Von Neumann machines, which surfaces any conflicts. The vast majority of the logic is purely functional: it reads the current state and passes back transactions describing the changes to make, so conflicts between transaction requests from supposedly independent subsystems are obvious. Algorithms are NOT written in the "linear sequential location-modifying" style of Von Neumann machines; instead, they are written so that DEPENDENCIES are EXPLICIT. If two parts of a computation are not supposed to depend on each other, it can be proven - or at least demonstrated at run-time - whether they are interfering with each other).

- - - - - - - - - -

Hmm. This turned into a "manifesto" about ideas I've been noodling with, frustrated by the practical limitations of the available programming environments. I guess I'm trying to turn Mathematica into the premier environment for researching O-O programming ideas. But my motivation isn't "pure research".
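The "coverage of all cases" point above - never saying "if x > 0 then ..." without also saying what happens otherwise, including for null - can be sketched in Java (a toy example of my own, not from the post):

```java
// Hypothetical sketch of "coverage of all cases": a sign function where
// every branch, including the commonly-forgotten null case, is handled
// explicitly rather than left implicit.
final class Sign {
    static String of(Integer x) {
        if (x == null) {
            return "undefined";   // the null case is named, not forgotten
        } else if (x > 0) {
            return "positive";
        } else if (x < 0) {
            return "negative";
        } else {                  // x == 0: the one remaining case, stated explicitly
            return "zero";
        }
    }
}
```

A rigorous environment would prove, rather than merely encourage, that the branches are exhaustive.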
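The "master clocks and buffered inputs" idea - pure logic reading state and returning transactions, with conflicting writes surfaced instead of silently overwriting - might be sketched like this (class and method names are my own illustration):

```java
import java.util.*;

// Hypothetical sketch: subsystems read the current state and return
// transactions (requested writes); a master clock applies them all at
// once, and any two transactions that write the same cell in one tick
// are reported as an explicit conflict.
class Clock {
    static Map<String, Integer> tick(Map<String, Integer> state,
                                     List<Map<String, Integer>> transactions) {
        Map<String, Integer> next = new HashMap<>(state);
        Set<String> written = new HashSet<>();
        for (Map<String, Integer> tx : transactions) {
            for (Map.Entry<String, Integer> write : tx.entrySet()) {
                if (!written.add(write.getKey())) {
                    // Supposedly independent subsystems touched the same
                    // cell: a visible conflict, not a silent overwrite.
                    throw new IllegalStateException(
                        "conflict on cell: " + write.getKey());
                }
                next.put(write.getKey(), write.getValue());
            }
        }
        return next;
    }
}
```

Because the subsystems never mutate shared state directly, their dependencies are exactly the cells named in their transactions - which is what makes interference provable, or at least detectable at run-time.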
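The "start-up phase vs. normal operating state" distinction could likewise be enforced at run-time, as in this small sketch of my own (a real system would want the distinction checked statically):

```java
import java.util.*;

// Hypothetical sketch: wiring methods are only legal before freeze(),
// operational methods only after, and violations fail loudly instead of
// quietly corrupting the net of objects.
class Node {
    private boolean operational = false;
    private final List<Node> inputs = new ArrayList<>();

    void connect(Node other) {
        if (operational) throw new IllegalStateException("net already frozen");
        inputs.add(other);
    }

    void freeze() { operational = true; }

    int fanIn() {
        if (!operational) throw new IllegalStateException("net still under construction");
        return inputs.size();
    }
}
```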
Rather, I believe in "pragmatic tinkering": getting working code out the door to serve customers (and get paid), but in an environment where it is practical to spend a few days or months trying out new programming approaches. At a minimum, this is a way to better understand a given algorithm, which then gets re-coded into an unreliable Von Neumann spaghetti-mess. But gradually, more and more code moves into "logic" - something that can actually be analyzed, reasoned about, and understood by mortals.

Food for thought...

-- ToolmakerSteve