The range of Ziff-Davis/EWeek topic centers is certainly comprehensive, and each features news, reviews, opinions, and analysis, in the following categories:
Database http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3850-1
Desktop http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3853-1
Developer & Web Services http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3856-1
Enterprise Applications http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3859-1
Linux and Open Source http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3862-1
Macintosh http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3865-1
Messaging & Collaboration http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3868-1
Mobile Devices http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3871-1
Networking http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3874-1
Security http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3877-1
Storage http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3880-1
Windows http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3883-1
Wireless http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-95-1-1-618817-3886-1
With this cornucopia, there should be little reason for an assignment or a research paper to lack content.
http://www.pcmag.com/article2/0,4149,1382914,00.asp
The ever-spicy John Dvorak opines that blogs are already fading into the woodwork, citing the number of published blogs which are defunct, and suggesting that co-option by big media will doom blogs as an independent voice. One always warms to Dvorak, even when he is wrong, and I think he is here, for at least two reasons:
1) Filter blogs, like the one you are reading right now, are a useful resource, even if their readership is narrow and there isn't much feedback [Doesn't anyone out there like me?]; and
2) There are some famous and independent blogs which qualify as works of art in their own right, and the dedication their authors bring to them argues strongly for their continued survival.
I am not alone in my dissent, as the following articles indicate:
http://www.eweek.com/article2/0,4149,1393341,00.asp
http://www.extremetech.com/article2/0,3973,1394279,00.asp
http://www.eweek.com/article2/0,4149,1384450,00.asp
Article on how it is still easy to hijack someone else's domain name, made most interesting by the wall of secrecy which the principals throw up, which is not exactly comforting with regard to their accountability. Also indicates how aged some parts of the InterNet structure are, and provides a useful account of some of the real problems with DNS.
http://eastbay.bizjournals.com/eastbay/stories/2003/11/24/story1.html
Article discussing the problem leftover networking cable presents for landlords and tenants alike. There are millions of miles of unused cable in buildings, a situation made worse by the fact that higher network speeds and new devices make recabling a necessity. Pulling cable is sufficiently difficult and expensive that it is rarely done.
This is one of those 'everyday bits' of information which are useful to raise for discussion in IT classrooms.
http://www.washingtonpost.com/wp-dyn/articles/A8730-2003Nov23.html
Article discussing the degree to which intellectual works displayed on the InterNet [particularly the WWW] are ephemeral, particularly those which have a Web-only presence. Since the mean lifespan of a Web page is 100 days, we risk serious losses in our institutional collective memory. While librarians have suggested one solution in the form of a permanent URL, this has not caught on. Web archives like the "wayback machine" catalogue 20 TB of data monthly, and just barely keep up; other solutions impose additional complexity.
Of course, there is a lot of evidence that forgetting is beneficial for a culture; perhaps it is for individuals also. For a dark fictional look at some offshoots of InterNet information dependency, read Killing Time, by Caleb Carr.
http://www.computeruser.com/articles/2211,5,16,1,1101,03.html
Article discussing the personal effects of offshoring, suggesting that people harness their anger about this constructively. The case for offshoring is not always as ironclad as it seems: by promoting their own capabilities and backing this up with numbers, IT workers do have a chance of convincing management that their jobs should not be sent overseas.
It may not be a position IT folk find comfortable, but it does beat the alternative.
http://www.clickz.com/feedback/buzz/article.php/3112021
Insightful article on "The 10 Biggest Spam Myths", from which I quote directly:
1. There are only 200-300 hardcore spammers worldwide. They account for the overwhelming majority of junk e-mail.
2. Most spam comes from outside the U.S.
3. Spam legislation can end the problem.
4. [Spam can be defined.]
5. Legitimate marketers don't spam.
6. Opt-in is a sufficient spam deterrent.
7. Never opt out.
8. Microsoft is committed to helping end the spam epidemic.
9. A do-not-e-mail database will stop you from getting spam.
10. Spam can take down the whole Internet.
Since many of these points go contrary to what 'everyone thinks', they are worth considering and investigating. We cannot control the problem [and it is a problem with potential to estrange many from the InterNet] without thinking about it clearly and accurately.
http://www.businessweek.com/technology/content/nov2003/tc20031119_9737.htm
Article suggesting that Linux's competitive position may be much stronger than financial measures would suggest. The reason: Linux is developed in a decentralized way, which helps maximize the scope and speed of innovation. In making this argument, the article is in line with some received wisdom about complex systems operations, so it certainly serves as food for thought.
http://www.eweek.com/article2/0,4149,1390273,00.asp
The bloom may be off the Internet rose, according to this article, if something is not done to correct abuses. There may be something in this: initially the resistance to using the Net in its widest sense was the learning curve. While graphic interfaces and helper applications have reduced this, they have not eliminated it -- put another way, there are some inbuilt barriers to using the Net, even if it functions perfectly. But when people see it as a source of problems rather than solutions [even if it is just the experience which is being affected -- something I was powerfully reminded of last night when I closed my 4th pop-up ad, and thought "Man, this is getting a bit much; the enjoyment:effort ratio is starting to become unfavourable. If the sponsors doing the popups could only see how annoying they are, to the point I will voluntarily boycott a product/service which is not site-connected, they might cease and desist"] -- then they will stay away in droves.
While this has negative consequences, it might not be entirely bad -- a reduced-scale InterNet might be less commercial, might have less congestion, and would be less of a temptation to malware/scumware writers, scam artists, and advertisers who apparently need to be taught social responsibility through the application of creative disfigurement.
Of course it could all be corrected, but I don't see much hope of that. The evolution of parasitic activity on the Net parallels the equally tenacious hold of parasites (like politicians) in the rest of society, for a simple reason: the strategy works for the parasite, and if there is no easy counter, will continue to do so.
Another article which looks at this set of problems from a quite different perspective, in relation to connectivity threats, can be found here:
http://www.infoworld.com/article/03/11/21/46FEtrouble_1.html
Others, who think that we are best served by scrapping the existing InterNet structure and starting all over again, are represented by the following articles:
http://eletters1.ziffdavis.com/cgi-bin10/DM/y/eUgd0CyMye0HX60v1d0AN
http://cl.com.com/Click?q=58-mr3xIdJfwQnk9tNirDIjtb0CM9RR
http://networking.ittoolbox.com/news/dispnews.asp?i=105191
Article pointing out that as Microsoft products become increasingly expensive and obtrusive in the application of user verification measures, Linux looks more attractive as an OS alternative. The big block to this is that people want to use the same software at home as they do at the office [and since this represents a valuable form of free self-improvement, as an office manager I would go out of my way to encourage this], and Linux has not made a big corporate desktop entry yet.
One possibility would be the development of a critical mass of home users who could demand Linux in the workplace and have the numbers to back them up. One of the major attractions of Windows in the past was its lack of copy protection, which allowed home users [most of whom, let's face it, are not going to pony up $500 for a copy of OFFICE] to use the same software at home and in the office. If this connection is broken, it puts Microsoft's corporate position at considerable risk.
Which suggests, more than ever, that Microsoft's draconian pursuit of every possible purchase may be ultimately self-defeating.
http://www.guardian.co.uk/online/story/0,3605,1078077,00.html
Recounts the research of a professor in the MIT Media Lab, which suggests that we will, in the future, increasingly internalize the network. With pervasive connections in the real world, we get an augmented instead of a virtual reality. This is a prospect with problems as well as promise, but it does suggest a nuanced and convincing view of the future which could be used to underpin future curriculum planning for IT schools.
http://news.com.com/2100-7342_3-5103519.html
Summarizes a Robert Half report to the effect that USA IT starting salaries will fall nearly 2% in 2004. Unemployment for computer professionals is at its highest since data started being collected in 1982. On the other hand [no surprise here] the salaries for IT managers will increase, as will those in 'hot' IT fields.
The problem for educators in the practical IT field is that the 'hot' fields change so rapidly that an extended educational programme cannot cope. If the general prospect is so dismal, students will cease enrolling, programmes will close, and the law of supply and demand will work its inevitable effects.
Another viewpoint on this issue, encapsulating the debate between those claiming that a new technical worker shortage is inevitable and those saying offshoring will permanently reduce the IT workforce requirement can be found here:
A pair of opposed opinions on what the problem is and what should be done about it can be found here:
http://www.nwfusion.com/columnists/2003/1103biggs.html
This is obviously becoming a flashpoint as we approach the end of 2003, since articles on the topic are both multiplying and expressing a diversity of viewpoints, as exemplified by this:
http://www.siliconvalley.com/mld/siliconvalley/news/local/7225976.htm
http://www.techweb.com/wire/story/TWB20031105S0011
The Virginia Tech supercomputer based on 1,100 linked Apple Macintosh G5 systems, mentioned previously, bids fair to become the fourth most powerful supercomputer in the world, at a cost of 2% of the world's fastest [which can handle about twice the processing load]. What is interesting in this article is that the university plans to release a "kit" so others can produce their own supercomputer.
With a sticker price of some $5.2 million, you should start saving your pennies now....
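The price/performance comparison implied above can be checked with some quick arithmetic, taking the article's figures at face value (the 2% cost and "twice the processing load" numbers are as quoted, not independently verified):

```python
# Back-of-the-envelope price/performance check for the Virginia Tech
# cluster, using the figures quoted above: cost of $5.2 million, about
# 2% of the cost of the world's fastest machine, which handles roughly
# twice the processing load.

vt_cost = 5.2e6                      # dollars
fastest_cost = vt_cost / 0.02        # implied cost of the fastest machine
speed_ratio = 2                      # fastest handles ~2x the load

# Dollars per unit of processing, normalizing the VT cluster's load to 1
vt_price_per_unit = vt_cost / 1
fastest_price_per_unit = fastest_cost / speed_ratio

advantage = fastest_price_per_unit / vt_price_per_unit
print(f"Implied cost of the fastest machine: ${fastest_cost:,.0f}")
print(f"VT price/performance advantage: {advantage:.0f}x")
```

On those numbers, the commodity-cluster approach comes out roughly 25 times cheaper per unit of processing, which explains the interest in the "kit".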
http://www.siliconvalley.com/mld/siliconvalley/7298279.htm
Article reporting another industry trade group employment survey suggesting that over 200,000 IT jobs will be cut by the time 2003 ends, with 13% of technical jobs vanishing in the past two years. The only straws in the wind are the facts that the layoff pace is slowing, and that venture capital may start flowing to small companies again, fuelling some growth.
URL indexes a single floppy distribution Linux firewall designed for Ethernet connections to the Internet (cable or xDSL), allowing connection sharing. This would be a good example of a class project which students could use for home purposes as well, providing extra motivation.
http://www.foundstone.com/resources/freetools.htm
While security tools are useful adjuncts to classroom teaching, their cost can be prohibitive. Here is a link indexing a page of useful security tools for assessment, forensics, intrusion detection, scanning, and stress testing. When the menu sections are accessed, a page with a short description of each of these utilities displays.
The cost is hard to beat, since they are free.
http://www.searchengineshowdown.com/features/
Search engines have become a standard part of anyone's Web surfing experience. There are lots from which to choose. The link enables the distinctive features of dozens of search engines to be compared, allowing all and sundry to see what's what.
http://web.mit.edu/~simsong/www/ugh.pdf
Links to a downloadable .PDF file containing The UNIX Hater's Handbook which exposes the seamy underbelly of the UNIX (and by extension Linux) OS environment. While intended to be humorous, there were times when I found myself nodding in agreement, particularly when the authors savage the ungainly C syntax.
http://www.newsfactor.com/perl/story/21431.html
An article looking at a range of myths concerning Linux on the desktop, suggesting that cost advantages of the OS alternative will be limited at best, because of the cost of support. Commentary and related links are provided with this story.
Another article on Open Source [which is not coterminous with Linux, but actually is its superset] myths, is available here:
http://www.cio.com/archive/030104/open.html
which, along with sidebars and links, makes the following points:
* Low price tag is not the chief attractor -- functionality is.
* Although migration imposes costs, ultimately savings are real.
* While there is application support, it is more diffuse than in the closed-source world.
* There are no major legal barriers to Open Source adoption, despite the SCO brouhaha.
* Open Source can support mission-critical applications, and is doing so, for example, in the banking industry.
* Open Source is ready for the desktop.
The bottom line is that Open Source software makes business sense, and deserves consideration. Coming from an authoritative neutral source like CIO.com, this is a significant statement of the case.
http://whitepapers.comdex.com/data/detail?id=1051115002_827&type=RES&src=KA_RES_QUERY
Internetworking is in many ways the most challenging aspect of networking, but with the advent of our wired world, even small and medium-sized organizations may find it necessary. This white paper: "The Basics of Internetworking" extols the value of simplicity in solution deployment and management.
http://www.newsfactor.com/perl/story/21300.html
The MCSE certification is certainly the most popular, with holders numbering in the hundreds of thousands. The real question -- are all these people really necessary? This article concludes that there is a glut of MCSEs over all, though those with Server 2003 and security certifications may well stand out from the crowd.
Article describing weblogs, and their implications (largely negative) for Webmasters. Since the blog puts content publication under individual control, there is no need for a Webmaster. This article contains an extensive list of references to other articles on the subject, and a brisk, invigorating discussion.
http://www.masternewmedia.org/2003/10/12/microsoft_ready_to_achieve_lockin.htm
There are some issues which are sufficiently complex and yet so pregnant with import that even if one cannot understand them in detail and scale, some sort of reaction is called for, mostly along the lines of "do no harm". I have recorded my unease about the concepts behind Microsoft's "Trusted Computing" in previous articles, and I still have grave doubts about this initiative.
This article links to an online report on the risks involved in this development, with a wealth of internal links and additional references to other articles on this subject. This is one of these forks in the road which, taken blithely, can come back to stab us in some tender parts indeed.
A rather glowing description of the power and additional security capabilities offered by the new Apple Macintosh OS X version called "Panther". There is no doubt this is an impressive OS release, although there is some doubt that continually puffing out its metaphorical chest over virus immunity is a good thing -- I would think that some of the top virus writers would consider this a bit of a thrown gauntlet.
The OS is attractive, the proprietary nature of the hardware is not [although I do concede it also has advantages in terms of simplicity for users]. If some really in-depth true ROI costing were to be done here, the results would be most interesting for both organizational and individual users, although the practical barriers to producing such an analysis in a vendor-neutral and credible manner are so immense as to virtually preclude it.
A number of links to sources, resources, and additional articles are also provided.
http://www.llrx.com/features/rss.htm
A useful article with links to resources and additional discussion on RSS, the (perhaps overhyped but still) useful adjunct to Web site content distribution. One priceless line worth quoting: "RSS is also something that once you have read its description, you know less about it than you did before." The positive and negative aspects of this technology are clearly discussed, as is its future potential.
Additional thoughts from the same author on the future of RSS, with an extensive list of additional links, can be found here:
http://www.masternewmedia.org/2003/10/02/the_future_of_rss.htm
http://www.masternewmedia.org/2003/10/26/contextual_online_collaboration_tools.htm
A repeated theme in networking is the value of the online collaboration which it enables, and many prognostications of how this will revolutionize the workplace have been on offer. The reality has been somewhat less than overwhelming. While I still consider collaborative tools to have great potential, I have come to realize that technical and organizational barriers continue to prevent them from receiving widespread everyday use. Not least is the fact (inclined to the positive though I may be) that we don't have a convincing demonstration of competitive advantage resulting from implementing these tools.
What is blocked at the door sometimes comes down the chimney, as this article [which has an abundance of direct links and supporting information] suggests. Instead of using explicit tools, implementing collaboration through extension of existing tools in a way which makes it simply 'another menu item' has potential for success which escapes more 'architectural' solutions. Were this to be so, it would be another demonstration of how appropriate the 'bottom up' model is for network activity.
Indexes an online service providing firewall tests, allowing vulnerabilities to be probed safely, and also offering explanations and links for specific vulnerabilities. This would be an effective tool to deploy as a 'before and after' test instrument for a security class.
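For classroom use, the core of what such a probe does can be sketched in a few lines. This is not the online service described above, just a hypothetical minimal local port-check that a security class could pair with a firewall rule change as a 'before and after' exercise (only ever point it at hosts you own):

```python
# Minimal TCP port-probe sketch: reports whether a connection to a
# given host:port succeeds. The demo opens its own listener so the
# example is self-contained and harmless.
import socket

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Open a listener on an ephemeral port, then probe it.
    server = socket.socket()
    server.bind(("127.0.0.1", 0))
    server.listen(1)
    probe_port = server.getsockname()[1]

    print(probe_port, port_is_open("127.0.0.1", probe_port))  # listener up
    server.close()
    print(probe_port, port_is_open("127.0.0.1", probe_port))  # listener down
```

Running it before and after enabling a firewall rule on the test host shows students exactly which services became unreachable.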
A site article which is continually updated [available as a downloadable .PDF file as well] comparing Windows XP against Apple's OS/X. While the author admits that the criteria weights are not pellucidly watertight, they do give readers some insight into what works with each operating system, and what does not. The overall score puts OS/X into a solid lead.
Now if the hardware was not so blamed expensive....
http://certcities.com/editorial/features/story.asp?EditorialsID=64
In the Microsoft or Cisco world, it is straightforward to specify certification types and the paths among them. However, the Linux world is somewhat more complex -- this article profiles the various certification options available for Linux.
http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2003/11/13/MNGPU30H811.DTL
A California state appellate court has ruled that InterNet Service Providers can be held legally responsible if someone using their services defames someone online and the ISP knew about it. This makes an ISP equivalent to a bookstore or a library, instead of a telephone or telegraph network. The chilling effects of this misguided ruling on free InterNet speech cannot be exaggerated, and the ruling serves as yet another example of how the legal system continues to vitiate the promise inherent in information technology.
Of course, the ruling is being appealed...we can comfort ourselves that the lawyers at least, are making money out of this.
Article which notes that while the trends to offshore outsourcing are solid and will continue to accelerate, there will always be a set of locally-needed skills. In addition to security, product development, professional services, and customer services represent areas of job growth in IT. System administrators, in contrast, are being pushed out by the management capacities of multi-tenant system farms. Computer science is no longer considered "the place to be".
http://www.sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2003/11/03/BUGD42O8E41.DTL
The problems with attacks on the InterNet are becoming sufficiently severe to justify a multi-million research grant to a California team of universities to create an accurate model of the Net. By using a sufficiently complex test bed, the consequences of hostile action can be determined so policymakers can react successfully and network designers can improve infrastructure security, locking up the barn door before the horse is stolen.
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-87-1-1-618817-3574-1
Security is not quite like the weather: not everybody talks about it, and somebody does something about it. Identifying problems is all well and good, but this article goes further and reviews remedies which operate at the personal security level.
http://news.com.com/2100-1032_3-5106129.html
Following strong and concentrated protest, the USA's Patent and Trademark Office has agreed to re-examine the Eolas patent which effectively broke Web browsers (see previous entry) and was agreed to be both voided by prior art and a humongously bad social decision by anyone with technical competence. The article also indexes related stories on this issue.
I would expect the patent to be reversed on careful re-examination, so we may see here an uncommon victory for common sense -- and of course, another lesson in what happens when Microsoft's ox is gored.
http://whitepapers.comdex.com/data/detail?id=1053093882_459&type=RES&src=KA_RES
From the same folks minuted previously, comes a white paper: "Better Security - A Practical Guide - Network Security for Large Enterprises" that outlines planning for larger organizations. Certainly a useful reference for teaching about enterprise-scale networking security.
PUBLISHER: WatchGuard Technologies, Inc.
http://www.writeronline.us/guest/schuett-11-3-03.htm
An interesting and clarifying discussion of weblogs from the writer's point of view, making the point that a blog is the easiest way to get online [though unlike the tone of the article, I don't think the traditional Web site is going to become a complete dinosaur any time in the near future]. Also contains links to other sources of information about blogging.
http://www.digitaldeliverance.com/MT/archives/000281.html
Amid all the hoopla for RSS as a delivery system, here is a negative view which suggests it will not be all that useful to most people, who are not inclined to raise their technological heads above the trough of e-mail. I have been somewhat skeptical of those RSS proponents who regard it as replacing e-mail; the fact that current RSS installations require more of a software updating on the desktop than I care to contemplate at this moment suggests that even people who are technologically aware and conscious of the technology's benefits may not jump on this bandwagon with NICs unholstered.
RSS is, nevertheless, a new form of InterNet communications which enables information gathering in different ways, shifting the activity burden from accessing to using. The consequences of this ought to be major, if the technology really can deliver on its promises; if not, then never mind!
http://entmag.com/reports/article.asp?EditorialsID=53
With Windows Server 2003 now upon us, vendors of enterprise management software for servers have been busily rewriting their offerings. This article covers the state of play for the various classes of software under development. Server add-on software is something which is important in many real-life deployments, but which gets neglected in both certification studies and coverage of applied IT.
http://www.line56.com/articles/default.asp?ArticleID=5141
First of a two-part article on the opportunities and dangers relevant to Russia as a source of IT offshore outsourcing. Russia is not like India, and offers a specific talent pool which may not be replicated anywhere else outside North America and Europe. On the other hand, the chaotic state of business relations in Russia certainly introduces risk elements.
The bottom line: in IT you compete with the whole world.
http://security.itworld.com/nl/security_strat/11112003/
Even a relatively small network (one with, say, 20 hosts) which is connected to the InterNet can benefit from the security staff (or person) having a security laboratory. This need not be elaborate -- some obsolete hosts and connecting gear -- but can be an invaluable tool for testing solutions, as this article makes clear.
More on security testing labs [including cost issues] from:
http://www.geekspeed.net/~beetle/download/attacklab.html
http://www.giac.org/practical/GSEC/Greene_Paul_GSEC.pdf
This information has obvious application in just about any educational environment where information technology security is being taught.
http://zdnet.com.com/2100-1104_2-5103314.html
Extensive article suggesting that with the 'Longhorn' operating system build, Microsoft is moving away from general and open standards back into proprietary standards. The many improvements in the computing experience which Longhorn delivers are also specifically designed for it, creating user lock-in.
There may be resistance to lock-in, and I personally hope there would be, but past events give no reason for looking at this development cheerfully -- unless you hold wads of Microsoft stock.
http://whitepapers.comdex.com/data/detail?id=1060957258_977&type=RES&src=KA_RES
One of the major knocks against mobile devices has been the security problems they bring in their train. This paper: "Wi-Fi* Protected Access and Intel Centrino(TM) Mobile Technology Deliver a Robust Foundation for Wireless Security" may need to be accompanied by several grains of salt, given that it is by Intel, but it does suggest that upgraded standards are now available for WiFi security.
Another paper, from the same source, covers "Deploying Secure Wireless Networks: Intel's Strategies to Minimize WLAN Risk", and is available here:
http://whitepapers.comdex.com/data/detail?id=1067873911_604&type=RES&src=KA_RES
More on wireless security in the paper: "Seven Security Problems of 802.11 Wireless" which can be found at: http://whitepapers.comdex.com/data/detail?id=1067873924_916&type=RES&src=KA_RES
Yet more on wireless security, with a proprietary solution offered, from the paper: "Cresting the Wireless Wave with Security Solutions - Solutions to the WLAN Security Crisis" from:
http://whitepapers.comdex.com/data/detail?id=1067873917_615&type=RES&src=KA_RES
http://whitepapers.comdex.com/data/detail?id=1067434029_925&type=RES&src=KA_RES
The continuing press for network speed is evident in this white paper: "10 Gigabit Ethernet Technology Overview". The major problem we face with this on the LAN front is the condition of our building backbone, which may be designed in such a way that upgrading is difficult. At some point, one has to think that we will have 'enough' speed -- say, to produce HDTV on-demand -- but we are not there yet.
http://www.technewsworld.com/perl/story/31992.html
Two of the major corporate entities of the IT world, Microsoft and IBM, have radically different ideas of how the Internet and distributed computing are going to evolve. The fact that the former is dedicated to proprietary standards and the latter leans to open-source suggests, according to this article, the extreme degree of danger to Microsoft's future if the trend for good open source standards to drive out good proprietary standards continues.
http://entmag.com/news/article.asp?EditorialsID=6021
Business Intelligence is another formerly arcane process which automation and deskilling are now causing to be devolved throughout the workplace. In one sense, this is a vindication of the flattened hierarchy and theories of decentralized or 'bottom up' control. In fact, some BI operations have been automated entirely.
This is an excellent example of the importance of 'information' in IT -- the sort of technological upgrading which can result in greatly improved productivity. But as suggested previously in several entries in this blog, such improvement does not happen automatically -- the tools must be applied with vision. An example of a real problem with BI at Cisco, and an analysis of the structural stresses which this development is causing IT operations sound some useful cautionary notes.
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-85-1-1-618817-3487-1
A 'special report' on the 'Panther' version of Apple's OS/X, covering developer issues, its potential for enterprise use, patches, security, and the overall quality of this release [which is regarded as being high].
http://www.businessweek.com/technology/content/nov2003/tc2003114_2291_tc134.htm
The 'beige box' is now some two decades old as a form factor, and with the exception of laptops, has retained its same essential form [although having considerably more power]. This article looks at the future of distributed computing, and suggests we will have a multitude of form factors:
1) The cooling limitations of current chips will be overcome with different cooling media technologies, shrinking CPU size;
2) Specialized computers will proliferate, and often not even be recognized as such;
3) Increased power, capacity, and reliability of networks will enable a wide variety of mobile computing tools, since consumers will be able to get a fast signal from anywhere -- so a terminal will become equivalent to a PC;
4) Flexible displays will enable portability, and wireless connection can mean that public flat panels can be accessed by any system.
It took several decades for our current computing environment to evolve and it will probably take about the same amount of time for replacement technologies to become pervasive.
http://www.cioinsight.com/article2/0,3959,1309523,00.asp
Article suggesting that IT has become commoditized, making for a poor long-term employment picture. According to an ITA survey, some half-million jobs have been lost in the USA since the big boom busted, with little hope of most of them returning. Some draw comparisons with what happened to telephone operators when automated switches were installed.
The changing nature of the IT workforce means that the only skill with real growth potential is the "IT Plumber" -- the tradesman who comes in and repairs an automated system when it breaks down. Anyone who has paid for the services of a plumber in the recent past will appreciate how lucrative such a trade could be.
The continuing economic impact of offshore outsourcing means that USA IT workers will have to retrain or find other occupations -- some 2 million additional jobs can be outsourced in the next few years. The overall picture for software development is a transition from a "master craftsman" model to an "assembly line" model.
Additional references are provided at the end of this article.
http://technewsworld.com/perl/story/32028.html
As a general rule, a given technology reaches perfection just as it becomes obsolete -- this article suggests that the humble computer mouse is in exactly that position. Some form of replacement looms over the 3 - 5 year horizon, whether it be improved speech recognition [think 'Scotty' picking up the mouse and saying "Computer!"] or a more versatile [and potentially hard-to-learn-to-use] physical input device that we hold or wear.
Certainly the major block to most people's ability to use computer technology is the input function, so major advances here could have major consequences.
http://abcnews.go.com/sections/SciTech/FutureTech/RSSWeb031029-1.html
One sign that the technology known as RSS has reached the mainstream is that the acronym is given different interpretations [Rich Site Summary, Really Simple Syndication]. This article explains the basics of RSS and what problems still remain. For my 2 cents, the fact that you have to install the .NET framework before being able to make a desktop aggregator work is the main stumbling block.
If I can't do it in 5 minutes, it won't get done -- this is my latest computing mantra, and RSS still has to succeed on that account. But if your life involves tapping into news/information services and making sense of them, an aggregator is an essential tool. One could conceive of an organization which aggregated at different levels, a form of meta-aggregation, so to speak, which might help people see the InterNet forest in that profusion of trees.
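For the curious, the core of what an aggregator does is simple enough to sketch in a few lines: fetch a feed's XML and pull out the item titles and links. The sample feed and function name below are invented for illustration, not taken from any particular aggregator.

```python
# Minimal sketch of the heart of an RSS aggregator: parse an RSS 2.0
# feed and collect each item's title and link. The sample feed is
# invented for illustration.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Tech News</title>
    <item><title>Story one</title><link>http://example.com/1</link></item>
    <item><title>Story two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return a list of (title, link) pairs for each item in the feed."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.iter("item"):
        items.append((item.findtext("title"), item.findtext("link")))
    return items

print(parse_feed(SAMPLE_FEED))
```

A real aggregator adds fetching over HTTP, caching, and a reading interface on top of this core, but the parsing step really is this small -- which is why the .NET prerequisite grates so.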
http://www.itworld.com/Tech/2987/031029datadouble/index.html
I tend to be somewhat bullish about IT, but I confess this article surprised me: the amount of stored data has doubled in the past three years, according to a UCB report -- the total in 2002 was 5 exabytes [or 500,000 Libraries of Congress], most stored on magnetic devices.
No wonder that petabyte storage for small firms and well-heeled individuals is becoming a reality -- data growth at that level makes for a quantity/quality shift. With a doubling rate of every three years, the idea of anyone ever having "universal knowledge" becomes risible.
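The doubling claim is easy to project forward as back-of-envelope arithmetic, starting from the article's figure of 5 exabytes in 2002 [the function name and the years chosen are mine, for illustration only]:

```python
# Back-of-envelope projection of stored data, assuming the UCB report's
# figure of 5 exabytes in 2002 and a steady doubling every three years.
def projected_exabytes(year, base_year=2002, base_eb=5.0, doubling_years=3):
    """Projected stored data in exabytes under constant doubling."""
    return base_eb * 2 ** ((year - base_year) / doubling_years)

for year in (2002, 2005, 2008, 2011):
    # 5.0, 10.0, 20.0, 40.0 exabytes respectively
    print(year, projected_exabytes(year))
```

At that rate the total passes 40 exabytes within a decade -- which is exactly why "universal knowledge" becomes risible.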
One dark side -- magnetic storage is vulnerable -- a large and tempting target for those who would create massive social disruption.
http://entmag.com/news/article.asp?EditorialsID=6017
Scapegoating is a common organizational practice, and comes out in full form when a major virus incident happens. This article suggests that attempts to place blame are ultimately counterproductive, so this human impulse should be resisted.
http://entmag.com/news/article.asp?EditorialsID=6018
As viruses continue to escalate in complexity and destructiveness through using InterNet propagation methods, the protection model is evolving to fit. The concept of layered defense, which provides protection at the server and gateway level as well as at the desktop, has become prominent. In networking terms, this means more devices will have to be configured and monitored, which becomes an issue for both design and maintenance.
http://www.newscientist.com/opinion/opinterview.jsp;jsessionid=NEMPGLINCCMF?id=ns24191
The crisis in scientific publishing has been ongoing for so long it looks like normality. While this may seem off topic for a blog on applied IT networking education, the issues involved here relate to intellectual property and the most effective distribution of ideas. The latter is something at which networks are particularly good -- the article interviews a major US government scientist who has implemented 'open source' science publication, upstaging the expensive and slow-to-publish scientific journal system.
Since most scientists want to publish information and read that from others, the intellectual property issues here are much less pressing. Open source publishing may have just as much impact on scientific journal publishing as open source software development has on proprietary systems.
http://www.potaroo.net/ispcolumn/2003-07-v4-address-lifetime/ale.html
A convincing article with a lot of convincing statistics and equally convincing graphs, suggesting that we are not going to run out of IPv4 addresses Real Soon Now. In fact, we have about a decade of buffer space at least, meaning that we can slip into IPv6 slowly and deliberately.
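The shape of such exhaustion estimates is simple arithmetic: the 32-bit IPv4 space is finite, so the question is how fast the remainder is being consumed. The allocation figures below are illustrative assumptions of mine, not numbers from the article:

```python
# Rough arithmetic behind IPv4 exhaustion estimates. The fractions used
# in the example call are illustrative assumptions, not figures from the
# referenced article.
TOTAL_IPV4 = 2 ** 32  # the whole 32-bit address space

def years_remaining(allocated_fraction, yearly_fraction):
    """Years until exhaustion at a constant yearly allocation rate."""
    return (1.0 - allocated_fraction) / yearly_fraction

print(TOTAL_IPV4)  # 4294967296 addresses
# e.g. if 60% of the space were allocated and 4% more consumed per year,
# the remainder lasts roughly a decade:
print(years_remaining(0.60, 0.04))
```

The article's point is that actual consumption is slower and lumpier than the doomsayers assume, which is what buys the decade of buffer for a deliberate IPv6 transition.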
http://www.informationweek.com/story/showArticle.jhtml?articleID=15800263
Although CD-ROM media have been touted as having lifespans in decades, as someone trained in librarianship, I have been a mite skeptical. Over the past year, some rather alarming evidence of CD fragility has come to light, and this issue of the "Langa Letter" [a listserv well worth paying for in its 'Premium Edition', incidentally] discusses the problem in detail with a number of informative links.
One conclusion which dovetails strongly with my basic take on the situation is that some CD labels are destructive, though to my surprise, permanent markers, which I have always considered a sort of CD poison, are entirely OK.
Given the educational use of CDs for archival information, and especially the trend to move data from old [yet still working] magnetic media to CDs, this is a practical issue which needs careful examination, and the referenced URL is an excellent place to start.
http://techupdate.zdnet.com/Gartner_predicts_future_of_IT.html?tag=zdannounce0.list
In many ways, the URL says it all: Gartner predicts that in 2004 cost-cutting will be gradually replaced by innovation for growth, leading to an upswing in 2005 and a major surge in 2006. The next generation of computing will be built on a service software architecture based on always-on communication. "The next wave of technology [is] the confluence of pervasive wireless, real-time infrastructure, service-oriented architecture and low power-consumption mobile devices...".
If this vision is even approximately correct, we should be thinking about its educational implications now, so our 2006 curriculum reflects real applied needs.
http://www.newsforge.com/os/03/10/30/0537250.shtml
Article on the 'Panther' upgrade of the Apple OS/X, describing its features and comparing it with Linux. Appended commentary gives some other information and opinion about this latest OS upgrade.
http://whitepapers.comdex.com/data/detail?id=1052406549_289&type=RES&src=KA_RES
Wireless technology is not only becoming a major aspect of networking, but it is also a complex specialty with a language all its own. The white paper: "The CIO's Guide to Mobile Wireless Computing" provides an easily comprehended overview of the major elements and issues involved in this type of networking.
http://www.corante.com/policy/redir/32260.html
The oppressive nature of the Digital Millennium Copyright Act, surely one of the most obnoxious national laws since the Volstead Act, has been somewhat mitigated by the USA Library of Congress granting exemptions in four cases:
* bypassing digital content protection for InterNet filtering software;
* allowing access to programs with broken or obsolete dongles;
* programs/games using obsolete formats/hardware; and
* e-books which prevent handicapped access.
Certainly these exemptions are both reasonable and supportable [and the fact that they are needed in the first place is ample condemnation of the DMCA]. We still do not have the degree of 'fair use' we previously had, and we could expect the Library of Congress to do more, but it is the library of Congress after all...
http://www.nytimes.com/2003/10/29/technology/29soft.html
The Eolas patent case has attracted sufficiently pungent commentary already in these pages, but the intervention of the World Wide Web Consortium on Microsoft's side is indeed an example of strange bedfellows. As the W3C submission makes clear, not only was the patent application invalid on the basis of 'prior art', but enforcing it will also cause economic harm throughout the Web.
One has to think that causing general harm in the support of mistaken policy decisions is simply another of these delightful surprises government constantly brings us. One also has to wonder when enough becomes enough.
http://news.com.com/2100-7345_3-5096702.html
An article describing how IT, in the form of improved Web services, is being used in education rather than as a subject of it. Microsoft has given a grant to MIT to research campus quality of life improvement through technology. Here is a case of clever giving, since the results can help Microsoft clarify its focus while establishing itself as an educational benefactor.
And of course the hope must be there that creating an appetite for Web services will improve the demand for Microsoft solutions in this area.
http://www.newsfactor.com/perl/story/22542.html
One of the many questions about Microsoft's next desktop version of Windows [the Longhorn project] is whether security will be enhanced to the degree the IT community feels appropriate. While Microsoft has had major problems with this issue, it has continued to make improvements. Yet to the extent that Longhorn adds new features, it also adds new risks. With a due date at least two years in the future, there will be, at least, some time to assess this carefully.
In addition to this article, links to some related articles on Longhorn and on security are provided.
More information on Longhorn can be found at:
http://news.com.com/2100-7345-5097537.html
http://entmag.com/news/article.asp?EditorialsID=6012
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-82-1-1-618817-3367-1
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-82-1-1-618817-3373-1
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-82-1-1-618817-3376-1
http://www.businessweek.com/technology/content/oct2003/tc20031027_9655_tc119.htm
The issue of offshore outsourcing has been ventilated sufficiently in previous posts -- this one points out that the cost savings are not necessarily all they're cracked up to be. While the direct labour costs are in fact much lower offshore, if the quality of product delivered is so low that it requires repairs, then the cost benefits are lowered or eliminated. Firms exist making good money repairing buggy offshore projects.
One would think that, at some ratio of overall costs [not necessarily unity], the second-order effects of offshore outsourcing would also get weighed in the balance -- alas, I suspect, only in some better world than this one.