At the moment, I think programming and
software engineering are headed in a direction where they will "take
what they can get" in terms of advancing their field. While
everyone seems to agree that some degree of automation in the process
would be greatly beneficial, it doesn't seem like this is about to
appear. On the other hand, small features help a little bit and come
up every once in a while, but none of them is the "one big thing"
that really propels either field forward. As a result, I don't see
everyone necessarily "dog piling" on one particular grand
idea that will revolutionize everything; everyone is simply
trying to improve.
I think that as the age draws near
where anything and everything, from lights to doors, locks, and appliances,
has an embedded system, some rough form of
"accepted default plan" will emerge for the software that runs
certain systems. For instance, it's possible "pool lights"
will have a very loose, minimal content package that provides a
bare minimum of functions to be implemented. I imagine similar things
for revolving doors, the gates on the subway, etc., to prepare for a
time when everything has a computer in it but still needs to be
specialized to some degree. Having some rough base to start
from ensures that no "basic" feature is simply forgotten;
if a feature is missing, it is omitted by choice. These plans could imply a
methodology of design or just a really basic shell of code that is easily modified.
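To make the idea concrete, here is a minimal sketch (in C, a common embedded-systems language) of what such a "basic shell" for a pool light might look like. Every name here is hypothetical, invented for illustration; the point is just that a default plan defines the bare-minimum functions, and a vendor overrides only what needs specializing:

```c
#include <stdbool.h>

/* Hypothetical "accepted default plan" for a pool light: the minimal
   set of operations every implementation is expected to provide.
   All names (PoolLight, turn_on, turn_off) are illustrative, not
   from any real standard. */
typedef struct PoolLight PoolLight;
struct PoolLight {
    bool on;
    void (*turn_on)(PoolLight *self);
    void (*turn_off)(PoolLight *self);
};

/* Bare-minimum default behaviors; a vendor could swap these
   function pointers out for specialized versions (dimming,
   color cycling, etc.) without forgetting the basics. */
static void default_on(PoolLight *self)  { self->on = true; }
static void default_off(PoolLight *self) { self->on = false; }

/* Construct a light that already satisfies the default plan. */
PoolLight pool_light_default(void) {
    PoolLight light = { false, default_on, default_off };
    return light;
}
```

Because the defaults are filled in up front, dropping one of them becomes a deliberate edit rather than an oversight.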
As for the programming environment
shown off in Future of Programming,
I'm unsure. While convenience is something that comes to mind,
security seems to be a big issue. A college student doing homework
with a group may really find this useful, especially for
installing software he'll only use for one class, but big
corporations getting into this seems unlikely. The very premise
of this project means that your data is in the hands of
someone else, and if you're making something big and secret for
your company, the kind of people with access to this system would
be the kind of people who could make something out of any information
gleaned. They also mentioned that you can ssh into other servers
you own from the command prompt in the software. If this feature is
used often, then they hold the credentials to access other data that isn't
even on their servers.
While it does have some nice features
that let you put all of your code into "scope" (zooming
features, tracing, etc.), I somehow doubt these will "revolutionize"
much of anything in programming. In addition, these features are
unlikely to remain exclusive to their system for
long. And while a program may work on their systems, it
must still be taken off and compiled in order to be tested on
other target machines. How will the servers stand up to heavy load,
whether from a multitude of testers or a manageable number with huge
projects? Will they simply slow down, or will data be destroyed?
I think that their project definitely
has a place somewhere. I doubt it's in the hands of every
programmer, though. The video supplied was released on June 29, 2012.
Over a year has passed, and before now I'd never heard of it; if
this were going to be the next big thing, I would imagine I'd have at
least heard a whimper by now.