Interesting paper detailing a method to industrialize software
potatohead
http://codevalley.com/whitepaper.pdf
I'm linking this here, partly out of general interest, but also with an eye to our up-and-coming P2 and its encryption / code-protect capabilities.
The proposal centers on a means of combining discrete, specialized software without exposing IP. Open-code people really won't like this much, and that's fine. I'm not sure I do, but the ideas in it are worth general consideration, even when one is opposed to the concept of industrialization.
My guess is some of you will find aspects of this highly controversial, while others may find it sparks some thoughts on ways things could be done. I'm in the latter camp, more than the former.
The core thrust of the method is concatenation. Everyone requests space in the software project, and they get addresses and other necessary touch points, much like the mailboxes we use on a Propeller today. As deliverables are completed, the project can be assembled and executed. Think of ELF as the framework and object code as the components, and you are on the track these guys are on.
And I find that interesting.
At the top, somebody writes a header, and they then literally cat all the deliverables together and go!
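As a rough illustration, here is a minimal sketch in C of what that build step could look like under these assumptions: each vendor delivers an opaque binary blob already padded to its agreed size, and the integrator just appends a header stub and the blobs in their agreed order. The filenames and slot order are made up for the example; the paper doesn't prescribe any of this.

/* cat_build.c - minimal sketch of a "build by concatenation" step.
   Assumes each vendor delivers an opaque blob already padded to its
   agreed size; filenames and slot order here are hypothetical. */
#include <stdio.h>
#include <stdlib.h>

static void append(FILE *out, const char *path)
{
    FILE *in = fopen(path, "rb");
    if (!in) { perror(path); exit(1); }
    char buf[4096];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, in)) > 0)
        fwrite(buf, 1, n, out);          /* copy the blob verbatim */
    fclose(in);
}

int main(void)
{
    /* The header stub and the vendors' deliverables, in agreed order. */
    const char *parts[] = { "header.bin", "vendor_a.bin", "vendor_b.bin" };
    FILE *out = fopen("project.elf", "wb");
    if (!out) { perror("project.elf"); return 1; }
    for (size_t i = 0; i < sizeof parts / sizeof parts[0]; i++)
        append(out, parts[i]);
    fclose(out);
    return 0;
}

Given those three input files, the result is a single project.elf, which is the whole point: the integrator never needs anything richer than cat.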
Comments
It is my conviction that the incessant demand for IP protection and the resulting closed source software is why we don't have an "industrialized" component model.
Sure, you can define interfaces between components that work over some network communication medium. That allows mixing of open and closed source components and protection of IP. Until, that is, I want all those components to run on my machine and find that they are all built for a different architecture or operating system than I have.
Then I would ask: What software crisis? Such a crisis was discussed a lot in the 1960's and 80's. Heck, now we have thousands of times more programmers than then. It's staggering all the new stuff that is coming every day.
The software crisis is created by totally underestimating the effort required to do whatever it is in software. An extreme example of this was the optimism for Artificial Intelligence back in the day. It seemed like it was so near. We are still waiting...
Then I would say software creation is not comparable to industrial mass production. Mass production is all about creating millions of the same thing as efficiently and cheaply as possible. Well, we already do that: there are programs and libraries and source code everywhere that you can duplicate instantly and for free. Job done.
Actually writing software is about creating something that has never existed before. How can you ever estimate the complexity of it or the time it takes to create? It's like speccing out a mathematical problem and then asking a bunch of mathematicians to tender for solving it. Might take two minutes, might take hundreds of years!
Anyway, in terms of achieving that component model, the World Wide Web is doing very well. Let's take this forum as an example:
1) Forum software from wherever.
2) Avatars from Gravatar
3) Search from Google
4) Potentially sign up and login from Facebook or whoever.
5) Libraries from jQuery
etc etc etc.
In this way the few guys at Parallax are leveraging millions of man hours to get the job done.
If you have a specialization put an API for it on the web and away we go.
"my Smile meter just exploded"
It's a paper written by MBA's to scam other MBA's
It's notable that there is no code on codevalley.com
Take away the network and what does it all look like?
More in a while.
Blown schedules and crippling losses are due to poor planning and poor process; this is well known and was solved ages ago.
That is, only base schedules on how long it took last time, and if there was no last time, then it takes as long as it takes. Therefore, schedule small things until there are enough to do a big thing. Do not permit requirements creep.
Pretty frickin' simple. The big boys with mature process do this all the time with excellent predictable results. Those with immature process ignore this consistently, and have consistently poor results.
It's less costly to implement proper process than it is to waste effort on a magic bullet, but folks love the thought of magic bullets.
Then came the era of huge, ambitious projects, many of which failed. A crisis of failure rather than a lack of manpower.
That era is not over yet.
Can you say who those "big boys with mature process" are? And what do they do?
Because what I am seeing is a lot of failures among big boys even today.
I agree with "schedule small things until there are enough to do a big thing."
However, that has also been the basis of the "agile development" snake-oil consultants for a long time now.
Example: A contract for modular rooms intended for northern Alberta is awarded to a company in southern California based on price.
Result: The insulation and heating system are not adequate for the low winter temperatures. Ice builds up in un-insulated flue pipes and shuts down the furnace.
That is only one of many similar problems I have seen over the years.
@Heater, yes! This is MBA type thinking, but it's also military / defense contractor type thinking too.
It's also about the "no network" model. The proposed model of "construction" would not depend on networks. It's about binary executables. Those may use a network, but the building of them would not.
The latter is what I found interesting. (spooks and generals) I don't think much of the proposal at all. I do think some variation of it will be attempted where code / IP secrecy / management is a significant part of the picture.
Think the polar opposite of open code and all that means.
Also think about software as discrete pieces, like legos, or the various parts of a rocket, or military machinery of some sort.
Here's an example of what they are getting at. A company here in the Portland area has really deep understanding of automobile drive train systems. If you want to off road something and have it be able to take a beating and perform, these guys are some of the best.
So they are often contracting with major auto companies who desire those features in their cars and mostly trucks.
Their deliverable is a set of mechanical components that perform their task and that will fit into the vehicle as it's being designed by others. They ship the documentation and models needed to make those parts, but they don't ship their IP in direct form, only the product of it.
That is what the authors of the paper are trying to get at, and they submit the only way to do that in software is to ship binary executable code. At the extreme, said code will not even require linking!
The deliverable gets defined, the software people think on it and make a request for binary space in the project. They are given an address and pointers to things they need to work with, such as input and output buffers, and they fill that space with a compiled version of their software that performs to the specification mutually agreed upon.
No source code will change hands, and they are free to obfuscate their executable too.
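To make that concrete, here is a sketch of the kind of "contract" a vendor might be handed: a fixed base address and size for their code slot, fixed addresses for the input and output mailboxes, and the entry-point shape their blob must honor. Every name and address below is hypothetical; the paper doesn't specify any particular format.

/* slot_contract.h - sketch of a vendor "contract": fixed addresses for
   a code slot and its I/O mailboxes.  All names and addresses here are
   hypothetical, not taken from the paper. */
#ifndef SLOT_CONTRACT_H
#define SLOT_CONTRACT_H
#include <stdint.h>

#define SLOT_BASE   0x00402000u   /* where the vendor's blob is placed      */
#define SLOT_SIZE   0x00001000u   /* the space they requested, and no more  */
#define IN_MAILBOX  0x00410000u   /* input buffer, filled by the caller     */
#define OUT_MAILBOX 0x00410100u   /* output buffer, read back by the caller */

/* The only thing the vendor promises: code at SLOT_BASE with this shape. */
typedef int32_t (*slot_entry_t)(const uint8_t *in, uint8_t *out);

#endif

The client side would just call through the agreed address, something like ((slot_entry_t)SLOT_BASE)((const uint8_t *)IN_MAILBOX, (uint8_t *)OUT_MAILBOX), and never see how the slot's contents were produced.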
To me, this is a horror show! How would anyone know anything?
The answer is they won't, and it's a scheme of forced trust and tests.
In short, they want to do the same kinds of things with software that they do with buildings, and other physical components.
Bits of software become "parts" that can be assembled into a working whole. And by assembled, I mean concatenated into that working whole by literally stacking things up one by one, until it's executable.
At the start of the project, an ELF header gets made, and when it executes, it simply returns to the OS. That's it!
The planners would look through lists of software components much like they would mechanical things. Many of those would be common, like most mechanical things would be. Some would be novel, and so on.
They think they can somehow reduce software down to components that operate in a standardized way, much like nuts, bolts and threads do.
So they start handing out address space and pointers, others compile and deliver their pieces, and it all gets built out in that address space.
At the end of the project, all those pieces are combined into an ELF executable that does whatever it was supposed to do. All of this with no source code or IP changing hands, just data, pointers and addresses.
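Here is a toy sketch of the project side of that: a trivial broker that hands out base addresses as vendors request space, then expects each delivered blob to live exactly where it was assigned. The load address, vendor names, and sizes are all invented for illustration.

/* space_broker.c - sketch of "handing out address space": each vendor
   requests a size and gets back a fixed base address.  Purely illustrative. */
#include <stdint.h>
#include <stdio.h>

#define IMAGE_BASE 0x00400000u   /* hypothetical load address of the image */

static uint32_t next_free = IMAGE_BASE;

/* Reserve 'size' bytes for a named vendor and return their base address. */
static uint32_t reserve_slot(const char *vendor, uint32_t size)
{
    uint32_t base = next_free;
    next_free += size;
    printf("%-10s gets %u bytes at 0x%08X\n",
           vendor, (unsigned)size, (unsigned)base);
    return base;
}

int main(void)
{
    reserve_slot("header",   0x80);     /* the stub that just returns */
    reserve_slot("vendor_a", 0x1000);   /* sizes here are made up     */
    reserve_slot("vendor_b", 0x2000);
    return 0;
}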
What I found most notable is the IP avoidance. They want the benefit of it and are willing to pay, but they don't want to share it.
Currently, it's hard to build big things without sharing cross-domain expertise. They want to make it possible with even less sharing.
I found this curious as a complete extreme, polar opposite to open code, and that's why I posted it here.
It's not just the MBA who is thinking this stuff. It's also the spook and the general too.
I've never had much luck with industrializing anything. About the only thing that I feel sure about being industrialized is nations, and that generally leads to a whole lot of pollution, a bunch of jealous neighboring nations, and so on.
Realistically, just swap the word 'industrialize' with 'exploit' and you might get clarity.
I actually spent one term in university writing papers that I purposely didn't understand and managed to get by with all 'B' grades. Some people are awed by elusive terminology.
Military and defense contractors deal with only one reality -- survival of a sustainable cash flow. Objective concepts are not part of their ecosystem. So this stuff is just trying to shake the money tree. Don't spend too much time trying to sort out credible meaning... it is likely adware to promote a conspiracy or just a chance to move one step closer to a higher degree.
Yes, some shaking of the money tree is happening.
That's not the point.
When they get funded and who they talk to often is the point.
The IP avoidance / management is a matter to be concerned about. A lot of the world recognizes software patents, as well as process patents. Some of the world does not yet.
Open code, as we like and know it, sees that stuff as toxic. I tend to follow efforts centering on anti-open as much as I do ones that promote open. I see this as one anti-open such effort.
A few go somewhere. Most don't. Ideally, this is in the class of "don't".
"Interesting" to me isn't about "hey, I want to use that" in this context as it is types and ways of thinking.
To your other comment on exploitation.
That's not in and of itself a bad thing. The nature of the exploitation, whether its costs (including external costs) are being considered, and the products of it all are what's relevant.
I would counter with "design" as being a great check on raw, industrial exploitation. We don't always have to make the mess we do. Design applied all the way down to the core bits can yield processes, and by extension, industry that is very good for us.
Trouble is, design costs money, as does doing the more right or better thing. Some of us get that, treating people, our planet, and others well, or as we would ourselves want to be treated. A lot of us don't, and so there you go.
I'm a critical person, but I work to avoid rabid cynicism. And that means taking an idea and mulling it over some. Once in a while, doing that yields something notable, useful or surprising. At the least, it can help understand others and their differences.
I'm also a realist and that means understanding the state of things now, then building from there more than it does from scratch, or brand new. Real change and I would submit, real benefits often happen incrementally. This bit, that bit, all add up, until it's notable. Then the masses will move and shift accordingly.
People love the idea of the big disruption. These clowns are thinking there is one still possible for software. Linking that to industrialization is kind of a mess, of course. But make no mistake, that idea of disruption is dangling out there with people thinking on it often. For what it's worth, that's true of just about any thing where large amounts of money are moving.
Here's one: The design of things to be hostile to DIY / indie repair. The "TV Repairman" is largely gone, replaced by lots of obscure parts, hidden tech manuals and references, copyrighted code and god only knows what else. There is huge money in this right now, and a lot of the value is artificial value. Ripe for disruption somehow. That disruption will be an expression of real value with an order of cost difference attached.
These guys see the construction of software in a similar way. I think they are wrong, but that is what they see. And of course, the amounts of money in play, given computer code will hit a trillion dollar global market cap here any day soon, means they will continue thinking these things. It's too much money not to. 500 billion dollars, just in the US, according to code.org
Seeing how all of that could play out means looking at things like this to see where people might be headed, what they want, priorities, and all sorts of other stuff.
On a technical note, the phrase, "concatenation as irreducible, core operation" seemed worth some thought. If one applies that as a constraint, what does building programs look like? Lots of neato ideas in there, even if the idea or purpose expressed here is a bit of bunk. (and I think it is)
So, P2. It's got code protect with encryption. At least it did. Haven't heard much about that for a while, but let's assume it's still in there.
A scheme like this would mean being able to build something and have it be an atomic, executable thing. It works in its COG, or sits in its address space, whatever, right?
Want to buy one of those things? Supply the target key, and one so encrypted gets delivered, black box, input and output compliant and unit tested per test spec.
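Schematically, the exchange might look like the sketch below. The P2's real code-protect mechanism isn't modeled here at all; encrypt_for_target() is a hypothetical stand-in (a toy XOR stream, not real cryptography) just to show the shape of "key in, encrypted black-box blob out".

/* deliver_blob.c - schematic of delivering a component encrypted for one
   target.  encrypt_for_target() is a placeholder, NOT a real cipher and
   NOT the P2 scheme; the key and blob contents are invented. */
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

static void encrypt_for_target(uint8_t *blob, size_t len,
                               const uint8_t key[16])
{
    for (size_t i = 0; i < len; i++)
        blob[i] ^= key[i % 16];          /* toy stand-in for real crypto */
}

int main(void)
{
    uint8_t customer_key[16] = { 0xA5 }; /* supplied by the buyer      */
    uint8_t component[256]   = { 0x90 }; /* the vendor's compiled blob */

    encrypt_for_target(component, sizeof component, customer_key);

    FILE *out = fopen("component.enc", "wb");   /* ship only this file */
    if (!out) { perror("component.enc"); return 1; }
    fwrite(component, 1, sizeof component, out);
    fclose(out);
    return 0;
}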
Anyway, that's it. Just some interesting ideas I thought might generate more discussion than they did.
Should we be spending our energies developing software systems to maximize profits and, by necessity, control by somebody? Or should we be tackling bigger issues?
Blah, I don't know.
What I do know is we humans make progress by sharing ideas. Every time knowledge is locked up to the benefit of the few we retard progress for the many.
Anyway, let me describe my current favourite development process:
I create my apps using Javascript running under node.js
I can cut and paste useful code from github or anywhere to help solve my problems
I can include coherent modules of JS code into my projects to help solve my problems.
Those modules could quite likely include closed source compiled C/C++ libraries or use binary executables. Which may be free or may cost money.
My application can make use of services on the net, that I may well have to pay for. Or not.
Every possible which way, closed or open, free or expensive, I can use it.
All along the way I just write Javascript. It's the glue I use to compose whatever I want.
Sounds to me like I am already using the solution these clowns are proposing, every day!
Take the network and IP exchange away and what is left?
Seemed to me the whole thrust of the article was how to make an economy of software component suppliers. Like in the auto or aircraft industries. Closed or open source, network services or whatever it takes.
The dev setup I describe can work without a network. Obviously if people want to give me or sell me some software components in source or binary form, then I have to be able to get them on some media. I also have to be able to deliver my finished product somehow.
Take away the networking and IP exchange and we are back to the 1970's when everyone was recreating wheels and trying to make progress by themselves. Perhaps with some occasional input from a published paper or word of mouth from acquaintances.
Humans cannot function without a network and IP exchange, whether it be word of mouth, papyrus, printed type, telegraph, or the internet.
I don't get your point.
Back to the gears and drive train I mentioned earlier:
You specify requirements, and the killer gear people return a gear system that will meet them, with a safety factor.
Those people do not share how they develop those gears.
Additionally, the gear people are given a place to work, and they may have a hand in that based on some discussions, and they are given the connecting locations, dimensions, etc...
The client takes those gears and puts them into the design, and that whole thing is often called design in place. If they get CAD models of the gears, they often just load them into the model, and see them in the right place, no transforms or other kinds of operations are necessary.
Northrop Grumman does this on big aerospace projects regularly. They don't care how it gets done, nor on what system, only that it does get done and that they have a neutral representation of it (a non-proprietary CAD model).
Another way to put this: loading all the geometry files from all the people who created them results in the product on the screen, regardless of how and on what system they were created.
In software, these guys are speculating on how that same thing can be done.
You are right that exchanging no IP at all is unworkable. The gear guys actually do have to deliver a gear; they just do not have to deliver the process by which that specific gear was made. Likewise, the software guys do actually have to deliver executable software, but do not have to deliver anything related to how it was made.
In CAD design, design in place is an old technique, and there are many modern and more powerful options. Same goes for neutral, geometry representations too.
But those are the two basic things needed to make a big product with a minimum of sharing of expertise, put simply.
In software, these guys are proposing that an executable and address spaces are the minimum needed for doing the same sort of thing.
When I did that, I did it at a premium. I didn't do it often.
In the case of the gears, I can tell you those skills are hard won and rare. Limiting distribution of them means gainful employment for a lot of people. And it means your awesome, goes there no matter what, drive train costs more too.
The open guys would say, why not just make all drive trains awesome by sharing? Indeed. But then what does everyone do to make more than a basic living? There are days when I wonder about all of that, because I really do want to share and share alike, but some of the bigger, more wealthy kids won't always return the favor.
Not all skills, means, ways, or rules are like that, but plenty are.
In addition to a premium, there is the consideration of one's market. There are some skills I won't sell anywhere, and others I would, depending on the markets associated with them.
In the major verticals, this dilemma comes up all the time. NDA, non compete, trade secret, and you all know the list. It is an ugly state of things that can and does leave people in bad places for the wrong sharing of this and that.
I have had to turn down some employment opportunities because their domain overlapped with a skill I had gained apart from them. Accepting could have left it entangled with them for a considerable time.
And that is crazy! But real too.
But when a company is selling specialized gears, they can't sell their secret sauce and expect to stay in business for long.
The purpose of doing so seems to be to capitalize on the model created and accepted by society, for the benefit of those that grasp how the process creates economic momentum.
So the paper presumes quite a bit in terms of world view and political orientation. It also may not work. It just may be that software industrialization might be a bit like herding cats. The software industry simply allows anyone to reprogram hardware according to what they desire.
If the results of software production are favorable, everyone will want to use them: some will copy illicitly, others will buy. And some will try to industrialize the results (take the innovation out of the hands of the originator and put it under the control of "industry").
IOW, I am just cynical about what industrialization does for society over the long-run. Cottage industries disappear, people have to rely less on customers walking in their doors and have to lobby government for contract work. The whole scheme of free enterprise is upset and potentially threatened.
It seems we live in a world where the cafeteria/commissary lifestyle has been imposed upon us by the economies of scale offered by industrializing the restaurant and retail trades; think McDs and Walmarts.
Did you really want to eat at restaurants that have no table service, provide tables bolted to the floor, and have you fetch your own servings? And did you really want to shop for everything based on the cheapest-priced one-stop shopping?
At some point, it seems like we give up a lot of the pleasantness of lifestyle and economic self-determination when we put industrialization forth as the ideal. Thus, a disappearing merchant class.
In sum, the issues are political and polarized in a classic way between capitalism versus socialism.
So if I have gotten out of the bounds of Parallax Forum topic policies, forgive me. But I feel that the paper was somewhat out of bounds to begin with. We should be free to have a cottage industry offering creative software solutions. Society needs a strong backbone of small enterprises as people just want to take care of themselves without all the politics that big capital and big enterprise throw at them.
1) A bunch of software component suppliers.
2) A means of suppliers advertising their capability and purchasers advertising their requirements.
3) A means of supplying the goods whilst keeping the supplied goods out of the hands of anyone except the customer who has paid for it.
4) A means of paying for the components. Doing the business exchange. Keeping track of what is delivered and what is paid for.
Item 1) bothers me because it seems to be a step backwards from industrialization: moving from large-scale, well-educated, trained, and equipped work forces to the old days of cottage industry.
Item 3) bothers me as it implies closed source. It even implies DRM, signed code, copy protection. It implies using components I cannot see into and importantly cannot test properly or check for rogue behaviors.
Item 4) is a revelation. I missed it before, but looking again now, it's right there at the top of the codevalley.com web page source. It reads as follows:
<meta name="description" content="Code Valley seeks to industrialise the software industry using a global supply-chain fuelled by bitcoin, capable of building immensely complex software." /><title>Code Valley: Peer-to-peer software...
It's also there in their PDF:
The size of the competing pool also has significant bearing on the intensity of competition within the pool. With the advent of technologies such as the Internet, with its cross-border reach, and Bitcoin, which provides border-less and instant wealth transfer, a global competing pool becomes practical. With careful design of the industrialised system, competition can be focused on key pressure points of software development. If vendor performance metrics such as speed, cost, performance, resource usage and rate of non-conformance are advertised to prospective clients, then competition will inevitably be driven by those metrics.
So that is what they want. That is the "big idea": a cottage industry of basement-dwelling hackers and back-street software outfits who can sell their wares around the world with little friction by using bitcoin or a similar crypto-currency.
I'm presuming that the crypto technology used in such currencies is also to be used for ensuring DRM. In a way the very code you supply becomes a currency exchangeable over the net in a peer to peer fashion like bitcoin and for bitcoin.
So far, I don't see how any of that "big idea" is about improving software quality, or improving the software engineering process.
Bottom line: I don't see how things will ever get coordinated enough for their scheme to work. I don't see who would be mad enough to use random, unverifiable code in any serious applications. I find it hideously offensive to the ideals of Open Source Software.
I maintain that Open Source and open collaboration are the way to go. The role model for this is the world of Maths and Science. Progress is not made by hiding information and everyone endlessly reinventing the wheel in secret and in their own way.
It all smells like a pitch for venture capital for a nebulous and unworkable idea.
Here it is zipped.