Where's Society Heading, Tech Wise
- Nitrofox125
- DBB Admiral
- Posts: 1848
- Joined: Sun Jul 07, 2002 2:01 am
- Location: Colorado Springs, CO, USA
- Contact:
Where's Society Heading, Tech Wise
With SpaceShipOne and stuff just coming to the forefront, where do you think we'll be in a hundred years or so? Which fictional society best portrays our future? Star Wars? Star Trek? Or something like The Day After Tomorrow? Just curious what y'all are thinking.
- Mobius
- DBB_Master
- Posts: 7940
- Joined: Sun Jun 03, 2001 2:01 am
- Location: Christchurch, New Zealand
- Contact:
Well, the Singularity will occur sometime around 2050, so all bets are off after that date. "The Singularity" is defined as the point in time at which it is impossible for humans to keep up with the rate of technological change.
This means that the rate of change of technology will be so high that no human can comprehend it - enhanced by artificial technology or not.
After the Singularity, the Earth could be ruled by 3rd or 4th generation AIs and humans won't have anything to do with inventing anything, or making anything ever again. (Look for first generation AI around 2025 or so)
How humans will adapt to this change (if the race even survives to 2050 - and the likelihood of us going extinct before then is definitely non-zero!) is unknown, but I can see at least 4 branches of humanity developing:
1) Neo-Luddites. This culture will "settle" for a level of technology they are happy with and continue to use that level, and nothing higher, from that point on. These will be the Amish of the future.
2) Transhumans. This group will openly embrace any and all enhancements to human capabilities and functionalities. Look for human consciousness inside hardware and animal bodies (either biological or android). Also look for human brains enhanced with a LOT of "wet-ware" to enable these humans to control, manage and use ultra-advanced technology.
3) Post-humans. This group will completely abandon a biological form altogether, and exist inside hardware or custom android bodies. They will reject many human systems, morals and ethical behaviours. Exactly what this group will do to, or with, the other two groups is unknown. These beings will be a synthesis of human and machine, and be capable of "magic". (Any sufficiently advanced technology is indistinguishable from magic.)
4) e-humans. These "people" will consist of millions (and eventually trillions) of people who are now dead, but continue to live on in virtual worlds created especially for such a purpose.
Exactly which form of machine or humanity "triumphs" over the others is not obvious to me (except it won't be the Luddites). I'd like to believe we could all peacefully co-exist, but knowing human nature, the strong will annihilate the meek, unless they decide to annihilate the galaxy first.
- Mr. Perfect
- DBB Fleet Admiral
- Posts: 2817
- Joined: Tue Apr 18, 2000 2:01 am
- Location: Cape May Court House, New Jersey.
- Contact:
- CDN_Merlin
- DBB_Master
- Posts: 9780
- Joined: Thu Nov 05, 1998 12:01 pm
- Location: Capital Of Canada
One thing I've noticed is that computers, no matter what their complexity, cannot even perfectly ANIMATE human motion, much less act human. Computers simply CANNOT truly simulate life. There will always be a difference between humans and computers, androids, and anything else we come up with. It's the same difference as between a human and an animal. The animal just doesn't have that certain something present in a human that allows them to reason, devise, plot, think, and yes, compute. I do not think we'll be seeing robots with intelligence matching or surpassing the human brain in 100 years. Though I must admit, the human brain can be incredibly stupid at times (see the thread about the guy trying to convert a lion to Christianity).
Mobi, you're an idiot. I second that STFU.
Stryker, I think you're very naive about the capabilities of the modern machine and what that might mean for future technological advances. They already surpass the human brain in many areas.
As much as I hate to admit it, we're on the cusp of some very interesting and scary technologies. Some of them will lead us in very good directions. Some of them will lead us down very, very dark and deadly ones.
It's just a matter of us putting the proper safeguards in place to prevent certain things from getting out of hand.
- Nitrofox125
- DBB Admiral
- Posts: 1848
- Joined: Sun Jul 07, 2002 2:01 am
- Location: Colorado Springs, CO, USA
- Contact:
I've always wondered if we will be able to simulate magic - control the very essence of the world we live in through technology.
And mebbe I'm the only one, but I'm truly interested in what Mobius has to say.
Dont'cha think that with the new emerging technology there will also be new technology to help us understand the great span of technology? Take a web browser, for instance: a browser takes HTML, DHTML, JS, and whatever the server-side goodies (PHP, ASP, CGI) send it, and puts it all into one readable display.
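To make that browser analogy concrete, here is a tiny made-up sketch in TypeScript terms (the URL and the function name are placeholders, not any real site's code): whichever server-side goodies built the page, the browser only ever gets back markup and scripts, and folds them into one readable display.

Code:
// Hypothetical sketch: the server-side tech (PHP, ASP, CGI...) is invisible from here.
// The browser-side code only ever sees plain HTML and turns it into one readable page.
async function renderRemotePage(url: string): Promise<string> {
  const response = await fetch(url);               // could be generated by anything
  const html = await response.text();              // all we get back is markup
  const doc = new DOMParser().parseFromString(html, "text/html");
  return doc.body.textContent?.trim() ?? "";       // one readable result
}

// Usage (in a browser): renderRemotePage("https://example.com/page.php").then(console.log);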
- Viralphrame
- DBB Ace
- Posts: 419
- Joined: Thu Jan 30, 2003 3:01 am
- Contact:
I think Mobius has seen too much Ghost in the Shell.

Mobius wrote:
Well, the Singularity will occur sometime around 2050, so all bets are off after that date. "The Singularity" is defined as the point in time at which it is impossible for humans to keep up with the rate of technological change.
This means that the rate of change of technology will be so high that no human can comprehend it - enhanced by artificial technology or not.
After the Singularity, the Earth could be ruled by 3rd or 4th generation AIs and humans won't have anything to do with inventing anything, or making anything ever again. (Look for first generation AI around 2025 or so)
How humans will adapt to this change (if the race even survives to 2050 - and the likelihood of us going extinct before then is definitely non-zero!) is unknown, but I can see at least 4 branches of humanity developing:
1) Neo-Luddites. This culture will "settle" for a level of technology they are happy with and continue to use that level, and nothing higher, from that point on. These will be the Amish of the future.
2) Transhumans. This group will openly embrace any and all enhancements to human capabilities and functionalities. Look for human consciousness inside hardware and animal bodies (either biological or android). Also look for human brains enhanced with a LOT of "wet-ware" to enable these humans to control, manage and use ultra-advanced technology.
3) Post-humans. This group will completely abandon a biological form altogether, and exist inside hardware or custom android bodies. They will reject many human systems, morals and ethical behaviours. Exactly what this group will do to, or with, the other two groups is unknown. These beings will be a synthesis of human and machine, and be capable of "magic". (Any sufficiently advanced technology is indistinguishable from magic.)
4) e-humans. These "people" will consist of millions (and eventually trillions) of people who are now dead, but continue to live on in virtual worlds created especially for such a purpose.
Exactly which form of machine or humanity "triumphs" over the others is not obvious to me (except it won't be the Luddites). I'd like to believe we could all peacefully co-exist, but knowing human nature, the strong will annihilate the meek, unless they decide to annihilate the galaxy first.
Re: Where's Society Heading, Tech Wise
Nitrofox125 wrote:
With SpaceShipOne and stuff just coming to the forefront, where do you think we'll be in a hundred years or so? Which fictional society best portrays our future? Star Wars? Star Trek? Or something like The Day After Tomorrow? Just curious what y'all are thinking.

on a political compass i'd say that Star Trek is rather utopian-authoritarian. Star Wars is generally liberal (near anarchy), with the Empire being the authoritarian "control everything" odd one out.
the message in "day after tomorrow" isn't relevant to technological advancement at all, not sure why you included it alongside the other 2.
i really can't imagine in secular terms where we'll be in the technological future. it could go so many ways. perhaps the human race will embrace another period of enlightenment and we'll start working together - that'll make for a nice Star Trek-y/Star Wars-y utopian future.
or perhaps we'll forever be stifled by our petty warring factions, families, nations, religions, corporations. if this escalates more and more, i'd rather not think about our dark future, for the powers that be (or will be) will likely CONTINUE to stifle attempts to "rise above it all with enlightenment", as they are now.
- Vindicator
- DBB Benefactor
- Posts: 3166
- Joined: Mon Dec 16, 2002 3:01 am
- Location: southern IL, USA
- Contact:
- Darkside Heartless
- DBB Captain
- Posts: 562
- Joined: Tue Dec 09, 2003 3:01 am
- Location: Spring City PA
- Contact:
All this "technology taking over the world" stuff is not possible. Even the most powerful computer cannot do anything but what it's told - exactly that and nothing more - and any programmer can tell you that. Take a look at ActionScript (the only programming I know): if you want to script a single button to go to scene 2, you need to tell it that it's a button, that it links somewhere, where it links to, and what to do when it gets there. I've seen million-dollar robots for exploring volcanoes and the like, and they are barely capable of even walking, let alone any self-motivation. There are robots that can "think" on their own (look up Mark Tilden), but they're no smarter than insects, even the most complex one, Roswell.
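Here is the same point as a made-up sketch in TypeScript/DOM terms rather than ActionScript (the "scene2.html" target and the button label are just placeholders): every single step has to be spelled out, or nothing happens at all.

Code:
// Nothing below happens unless it is stated explicitly.
const button = document.createElement("button");  // tell it it's a button
button.textContent = "Go to scene 2";              // tell it what it says

button.addEventListener("click", () => {           // tell it that it reacts to clicks
  window.location.href = "scene2.html";            // tell it where it links to
});

document.body.appendChild(button);                  // tell it where it lives on the page
// Leave out any one of these lines and the machine does nothing of its own accord.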
-
- DBB Fleet Admiral
- Posts: 2367
- Joined: Thu Jun 14, 2001 2:01 am
- Location: Israel
I think the bigger question is, when we really break out into space, who is going to divide up the areas that different nations will want to develop and/or plunder?
When man colonises Mars, will it be a predetermined parcel of land for each space-faring nation, as determined by the U.N.? Or will it be whoever gets there first wins the big lotto? Will territorial turf battles be limited to space, or will we see a WW3 break out over settlements on the Moon or Mars?
Oh and Mobius...good post.
Mobius, in all seriousness, I have absolutely no idea where you get this "Singularity" idea. That just sounds like pure sci-fi to me. Honestly, would a person from 1950 be completely lost in today's technological world? It would take a lot of explaining, but of course not. I don't really see technology evolving by leaps and bounds; in fact, in some respects, human technological development has slowed. Look at it this way: the automobile was invented in the late 1800s. More than 100 years later, we're still using the same fundamentally unchanged design as our primary mode of transportation. The same goes for trains and aircraft. As for space exploration, we were doing better in the Apollo era than we are now. Tech-wise, yes, computers keep getting faster, but the system I have now is not that different from the one I had 8 years ago. My life has not changed much at all over my 18 years due to the advent of technology (better computers and the Internet being the notable exceptions). I'm sorry, but I'm just not seeing any major improvement in overall human technology. Developing smaller and smaller cell phones with more and more useless features doesn't qualify as "progress," you know.
And if the post-modern horror you described ever did come to pass, the Earth would be better off destroyed. I'd be a neo-Luddite and damn proud of it; better to reject technology than to reject one's own humanity.
- Nitrofox125
- DBB Admiral
- Posts: 1848
- Joined: Sun Jul 07, 2002 2:01 am
- Location: Colorado Springs, CO, USA
- Contact:
Quote:
the message in "day after tomorrow" isn't relevant to technological advancement at all, not sure why you included it alongside the other 2.

The message I meant to include there was that technology and humanity would "burn out", either through resource consumption or pollution, and humanity loses all its progress. It was a bad movie to use as an example, but I think you can see what I mean now.

DH, I disagree. You can tell a computer to think for itself, basically, if that makes sense. A lower-level AI can be programmed.
Mobi, do you really honestly believe that we will have a singularity by 2050? coz i see many problems along the path, and 2050 is a rather close date to speculate on. i don't think you could really speculate anywhere near accurately on that until we actually see the first truly evolving intelligent AI - an AI that has great potential to better itself, within itself, like a human. THEN i'll start taking bets on when we reach the singularity. but with all of the complexities of human consciousness still so poorly understood, it's a long way off i think (unless we give it the means to evolve itself out of the primordial binary ooze - perhaps 1,000,000 monkeys programming on laptops for a year will do it).
If things were allowed to move forward at their current rate, with no change of direction, of government, or of basic human greed, then our planet would be dead by 2100. Probably before that.
The grim reality is that money comes before the environment, and without our environment being preserved, we will poison ourselves to death. Higher technology isn't the solution.
Let's put it this way. Machines are aids to humans. They're meant to help us by doing one thing better than we are able to do ourselves. Table saws cut wood faster than humans can chew it. A screw holds wood together longer and more easily than a human can. I have yet to see a machine that can think better than a human. Our best computers still can't beat grandmasters at chess 100% of the time, and that's with thousands of people doing applied research. I don't think anything is going to surpass the complexity, functionality, and flexibility of the human brain within 100 years.
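To show how those chess programs actually "think", here is a generic brute-force search sketch (made up for illustration; the GameState interface is not any real engine's API): exhaustive lookahead plus a scoring function, not understanding.

Code:
// Minimal negamax sketch over an abstract two-player game. Real engines add
// pruning, opening books and endgame tables, but the core is still brute search.
interface GameState {
  isTerminal(): boolean;
  evaluate(): number;    // score from the point of view of the player to move
  moves(): GameState[];  // all legal successor positions
}

function negamax(state: GameState, depth: number): number {
  if (depth === 0 || state.isTerminal()) {
    return state.evaluate();
  }
  let best = -Infinity;
  for (const next of state.moves()) {
    // whatever is good for the opponent is bad for us, hence the negation
    best = Math.max(best, -negamax(next, depth - 1));
  }
  return best;
}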
The sudden -- and surprising -- end of the fossil fuel age will stun everyone -- and kill billions. Once the truth is told about gas and oil, your life will change forever.
Envision a world where freezing, starving people burn everything combustible -- everything from forests (releasing CO2; destroying topsoil and species); to garbage dumps (releasing dioxins, PCBs, and heavy metals); to people (by waging nuclear, biological, chemical, and conventional war); and you have seen the future.
"Peek Oil" is one theroy...
http://www.lifeaftertheoilcrash.net
http://dieoff.org <--originator of quote above
-
- DBB Admiral
- Posts: 1557
- Joined: Sun Oct 07, 2001 2:01 am
- Location: Richmond,B. C., Canada
The future... Blade Runner to me. Climate a mess, technology advanced but useless to the little guy, and genetic manipulation of organics the big thing.
Or... Brazil. Terrorism so common as to be ho-hum, bureaucracy so deeply in control that no one is in control, and rebel A/C mechanics are freedom's hope. (no fair guessing my livelihood)
When the oil runs out - which will certainly happen for the most part within the next 50 years at still-increasing levels of usage - I would expect to see an incredible shock to the world economy, but as for killing billions, that I doubt. That's a very significant chunk of the world's population, and the Great Depression didn't do that.
But a new Depression will probably result, even bigger than the last one, and perhaps many millions might die. Unless, of course, people take a serious interest in looking for alternatives.
That will happen increasingly as we near such a crash too. I'm sure people will start to notice ever-increasing oil prices.
I'm not thoroughly sure, myself, if artificial intelligence is even possible in a self-aware sense, at least using the fundamentals of the computers we have today. I suspect an entirely different architecture would be necessary if the computer was actually intended to think for itself.
Is that desirable anyway? Why would you want something capable of making its own decisions, especially if it held any real power?
- Nitrofox125
- DBB Admiral
- Posts: 1848
- Joined: Sun Jul 07, 2002 2:01 am
- Location: Colorado Springs, CO, USA
- Contact:
Quote:
I'm not thoroughly sure, myself, if artificial intelligence is even possible in a self-aware sense, at least using the fundamentals of the computers we have today. I suspect an entirely different architecture would be necessary if the computer was actually intended to think for itself.

I agree with that. I think we'll need to have "analog" computers. We can already send enough information back and forth to a brain (see the rat brain flying an F-16), and we can create many body parts and cells, so I don't think it'll be long before we create brains for computers.
I think this "slow hybrid transition" is stupid. The companies are trying to appease both the consumer and the big oil companies. Everyone's afraid of change. Large car companies are afraid of making an electric car and pissing off the oil companies, and consumers are afraid of buying something new. Even if it were a car you plugged into a wall outlet, where does that power come from? Probably from fossil-fuel-generated power. A huge change needs to happen on the part of the big companies, and it's probably not going to until it's absolutely needed.
I wrote a story a while ago involving AI and its place in the universe. I don't especially believe this, I just thought it was an interesting point: creation is a cycle. God was created by some higher power, He created humanity, and humanity is on the verge of being the next creators by creating a new race of consciousness: AI. I'll post it in the PTMC Gallery sometime.
Nitrofox125 wrote:
...I think this "slow hybrid transition" is stupid. The companies are trying to appease both the consumer and the big oil companies. Everyone's afraid of change. Large car companies are afraid of making an electric car and pissing off the oil companies, and consumers are afraid of buying something new. Even if it were a car you plugged into a wall outlet, where does that power come from? Probably from fossil-fuel-generated power. A huge change needs to happen on the part of the big companies, and it's probably not going to until it's absolutely needed...

(note bold type)
nono, when you are talking about large-scale power production there is a huge bunch of environmentally friendly and renewable/limitless alternatives.
it's hard to make portable power sources - hydrocarbons and combustion engines were a stroke (heh) of luck in this respect. but where the power source is able to be limitlessly large and stationary you can get all sorts of payoffs, because it enables you to use so many methods of generating power that are unsuited to being made portable themselves.
solar power, wind power, renewable biomass combustion, geothermal power, fusion power, the list goes on and on. my favourite is "artificial volcano, 1 km high wind tunnel power". <-- i'm gonna travel to watch this one be built, it's freaking amazing.
-
- DBB Admiral
- Posts: 1557
- Joined: Sun Oct 07, 2001 2:01 am
- Location: Richmond,B. C., Canada
- Nitrofox125
- DBB Admiral
- Posts: 1848
- Joined: Sun Jul 07, 2002 2:01 am
- Location: Colorado Springs, CO, USA
- Contact:
As stated above, I think there will be a singularity, but it won't really make a difference, because there will be technology developed to help us with other technology, much like a WYSIWYG HTML editor. The WYSIWYG editor interprets the more complex HTML and turns it into something simple for us to use.
There may also soon be a way to "load" things into our brains, as in The Matrix, so the singularity may be only a temporary thing.
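As a toy illustration of the WYSIWYG idea above (purely hypothetical code, not any real editor's internals): the tool holds a simple description of what you want and generates the messier HTML on your behalf.

Code:
// A pretend one-function "WYSIWYG": describe the page simply,
// and the tool emits the underlying markup so you never have to look at it.
interface SimplePage {
  title: string;
  paragraphs: string[];
}

function toHtml(page: SimplePage): string {
  const body = page.paragraphs.map(p => `  <p>${p}</p>`).join("\n");
  return `<html>\n<head><title>${page.title}</title></head>\n<body>\n${body}\n</body>\n</html>`;
}

// Usage: console.log(toHtml({ title: "My page", paragraphs: ["Hello", "World"] }));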
That's doubtful. We still don't know enough about the human brain to be able to do something of that magnitude. It would also take away what makes us unique as individuals. What's the point of learning something, spending time to be as good as you can be at something, if someone else can just load the information into their brain and be able to do it on the spot?
Art, music, everything - everyone would be able to do everything with the same amount of skill. Perhaps not quite the same stuff would result from it, but it could certainly be copied by everyone else. It just wouldn't fly, IMO. It would take away the very essence of what is great about being human.
Also, what if the human brain can't handle forced instructions? What if there is a reason we learn through hands-on experience and time? Forcing the brain to learn something could be catastrophic to an individual's fragile mind. The brain might not want to accept the information, and the experience could result in a patient being put into a coma. Memories as well as practical information might be lost in the attempt to make room to store and comprehend the deluge of incoming data.
I'm not saying that science fiction can't become science fact - it happens almost every day now - but seriously, some of you guys need to spend a little less time watching movies and TV. Apparently it really does rot your brain.
Ford Prefect wrote:
Have you watched the video linked on the site, Roid? There is an operating Solar Tower in Spain. Stupid video is quite glitchy for me so I haven't been able to watch it all. Looks very interesting but I haven't been able to tell how much power they get per tower yet.

of course, the video you mention is from an episode of Beyond 2000.
the Spanish power plant the video covers produced 50 kW and was about 200 meters tall. i have a picture of it embedded in my "solar tower and railgun" thread, and here is a link to some information on it: http://www.sbp.de/en/html/projects/detail.html?id=82
the new tower however will produce 200 MW and will be 1000 meters tall.
Mobi, you'd write excellent sci-fi books, but here you actually seem to be claiming to predict the future. Oh well.
I don't care about the future. I don't believe things will escalate that fast either; you have been incubating these ideas because you picked them up in sci-fi movies/books/etc. You're free to believe what you want, though.
Depending on the way you look at it, the "singularity", as you call it, has already happened. Does anybody fully understand even the smallest bit of technology they're using? I don't think the peak of technology will ever surpass the peak of mankind. Averages are another matter, though. It saddens me to see that the average Joe is much, much more stupid and less educated than he was 25 years ago. The danger lies in the fact that technology replaces certain tasks of the human mind and, as such, renders it lazy and stupid. Why bother to calculate 422*7 if your cellphone or your watch can do it for you?
Along with pure skill and knowledge, I see common sense disappearing as well. Decency, humility, friendliness. It's not technology that will kill mankind, it's the lack of inter-human contact. In 50 years it's not technology I will be afraid of, it will be neurotic behaviour and social dysfunction. Mark my words.
Tricord wrote:
Along with pure skill and knowledge, I see common sense disappearing as well. Decency, humility, friendliness. It's not technology that will kill mankind, it's the lack of inter-human contact. In 50 years it's not technology I will be afraid of, it will be neurotic behaviour and social dysfunction. Mark my words.

This will only be true for a very small portion of the world's population. Third-world communities will not be overly affected by high tech. There will still be a large number of occupations, such as the construction industry, where people have to work together (in person) to get a job completed. Most people do not want to live in the kind of sterile, lonely environment indicated in Tricord's post. I know I don't.
Why would inter-human contact decrease any further past the current point? Most jobs these days still require it, and as the world becomes more automated, quite possibly even more will.
I still don't see signs of bricks-and-mortar schools disappearing either. In the future they may tend to, but whether inter-human contact there will disappear... I actually doubt it, because governments run most of them, and they of all people would want to ensure children pick up social skills.
Despite common perception, jobs in the IT field no longer allow people to work alone. Once upon a time that was the case, but now that the complexity of IT has reached today's levels, a single person can't accomplish anything significant any more.
In design, teams are the way of the future. That's all there is to it. Not just for IT; any technology-based field requires it.
- Nitrofox125
- DBB Admiral
- Posts: 1848
- Joined: Sun Jul 07, 2002 2:01 am
- Location: Colorado Springs, CO, USA
- Contact: