http://www.infoworld.com/infoworld/article/04/01/16/03FEgrids_1.html
After several years of 'more of the same' in PC developments, a cluster of technologies bids fair to bring us a new class of machines which combine speed and a small space footprint. Many of the individual developments summarized in this article are referenced elsewhere in this blog.
The major determining factor here is economic -- the degree to which existing machinery is still capable of doing the job. Since older machines typically have a higher maintenance cost, resolving this conundrum is by no means simple.
But if customers want to buy, manufacturers will have some attractive devices on display.
http://www.siliconvalley.com/mld/siliconvalley/business/columnists/7793099.htm
Every once in a while something comes along which has two contradictory properties:
1) I think it is the greatest thing since individually tubed cigars; and
2) It leaves me completely cold insofar as actually using it is concerned.
This article discusses an example [the Wikipedia] which ought to attract me like a moth to a flame, and also covers wikis [on line collaborative authoring tools] in general. These tools seem to bring both power and transparency to the knowledge display function. It all seems highly useful, and it definitely is generating a major user base.
And I can hardly summon even a flicker of interest -- I wonder why?
http://www.wired.com/news/infostructure/0,1377,62016,00.html
Effective mimicry is often a key element in the Darwinian struggle, which gives more point to this LinuxWorld report indicating that Linux is steadily moving to the desktop, gaining a Windows look-and-feel as it does so. Related stories are also listed on this site.
I have consistently advocated that anyone bringing out a Linux distro which is as easy to install and use as Windows [and achieving that goal is not hard] could corner the home market in an extremely short time, if marketed correctly. Implementing a GUI which will work for the Windows-aware crowd is one important key to this strategy.
http://www.internetnews.com/dev-news/article.php/3302121
Brief article discussing the thesis that companies are stockpiling patents as a way of extracting licensing fees. Even if the patent is contestable, it may be cheaper to pay the fee than to fight the case in court. The long term implications of this for software developers are extremely negative.
More to the point, the stockpiling of patents as a litigation weapon seems a perverse frustration of the original intent of patenting, which was to increase the spread of useful knowledge.
I am firmly on the side of those who consider the current state of software patenting to be broken -- I am less sanguine that it ever can be repaired.
http://cl.com.com/Click?q=c6-URl9QMLMgFclu7mdIX6zfm9shRRR
It is worth reminding ourselves that all the whirring fans and blinking lights are not ends in themselves. This white paper: "A Vision for Business Intelligence" attempts to show how we can use a powerful technology which has been underappreciated and not used to its fullest. This is an example of a broader problem with implementing technology and innovation within organizations; a general solution would be a most powerful empowering tool.
http://cl.com.com/Click?q=fe-xkKgQwgDyMrGUcmkPi4jg6hhvRRR
Identifying the best communication solution, especially where WANs are concerned, is by no means straightforward. This white paper: "Frame Relay vs. IP VPNs" discusses the pros and cons of both approaches, and will be valuable in those teaching environments where a fully-configurable WAN is not ready to hand to serve as an experimental subject.
http://whitepapers.comdex.com/data/detail?id=1050602782_964&type=RES&src=KA_RES_QUERY
Your network can never be too fast, so a white paper on "Strategies for Optimizing LAN Performance" certainly is a useful thing to have on hand. It describes potential bottlenecks, how to identify them, and how to cure them.
http://www.businessweek.com/technology/content/jan2004/tc20040127_2819_tc047.htm
The RIAA has succeeded in putting its foot in its mouth before shooting itself in the foot. By its aggressive tactics it not only looks like a bully, but it also gives a powerful impetus to the use of strong encryption for all Netly transactions. Once that happens, the 'enforcers' will not even be able to find their victims without an extensive amount of legwork -- more than the game is worth.
And by forcing file sharers underground, the RIAA also makes it more difficult for others pothered about the implications of the digital era, like the MPAA, to detect transgressors.
http://webopedia.internet.com/quick_ref/BIOSBeepCodes.asp
When a computer most needs your help, it is close to helpless to tell you what is the matter -- if the screen will not display, there is no way you can read an error code. Like Morse Code, beep codes are not much used these days, because the boot process is so reliable, but if you ever need them, you need them bad. The above set is for an AMI BIOS; the following set is for the Phoenix BIOS: http://www.webopedia.com/quick_ref/BiosBeepCodes2.asp
http://www.businessweek.com/technology/content/jan2004/tc20040121_9640_tc139.htm
While RAM has widely been perceived as a major bottleneck in current PC architecture, it has been the 'only game in town'. The fact that RAM is slow is compounded by its need to be refreshed and its volatile nature. The article suggests alternatives which are in development and which have the potential to completely change PC architectures in the next 5 - 10 years.
A current of related evidence supports this estimate -- in the last couple of years we have moved to the 1GB level on a well-stocked desktop [and indeed, 256MB is getting close to the acceptable minimum on even low-end systems], and servers routinely press against the 4GB limit of current 32-bit chips. Size, heat, and cost considerations all suggest that we won't simply go on adding DIMMs to make this memory possible [though the recent announcement of 2GB modules certainly increases memory density to the point that few slots are needed to support 4GB RAM], making a new architecture more, rather than less, likely.
Add to this the overall need for system speed, particularly in graphically intense applications, and a new architecture looms ever more probable.
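The arithmetic behind that 4GB ceiling is worth making explicit -- a quick sketch in Python, involving nothing more than the powers of two at stake:

```python
# A 32-bit chip can address 2**32 distinct bytes -- the 4GB ceiling noted above.
addressable_bytes = 2 ** 32
print(addressable_bytes // 2 ** 30)  # 4 (GiB)

# With the newly announced 2GB modules, two slots suffice to hit that ceiling.
module_bytes = 2 * 2 ** 30
print(addressable_bytes // module_bytes)  # 2 modules
```

Which is exactly why memory density is no longer the constraint; the address space itself is.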
http://www.againsttcpa.com/tcpa-faq-en.html
I have commented on the "trusted computing" issue several times in this blog -- the indexed article discusses the ins and outs of this technology, and overall holds it to be a Bad Thing And A Bad King. As do I.
But we have to realize that some of the things which trusted computing is intended to implement are not in the least objectionable [I have, for example, no problem at all with Microsoft being able to enforce payment on all who use its products], and will, in fact, be highly desirable to a small minority, who just happen to have access to the levers controlling the legislative system. It is equally true that some of the things which trusted computing could do are highly objectionable.
Then the question must be re-focussed: are the costs worth the benefits? In particular, would it be possible to control some of the objectionable aspects through the operation of standard commercial law? The problem with a positive answer here is that the technology of enforcement is sufficiently stealthy that it might be extremely difficult to detect and remedy non-compliance with such law.
Like many other things in IT and life in general, when examined closely, this does not turn out to be a simple topic at all, and we may get answers less by prescription in advance than by muddling through and working out what prove to be the inevitable consequences.
An engaging analytic article suggesting that over-concentration on Moore's Law may actually be harmful to technology planners and the industry as a whole. Noting that older processors are embedded by the billions, and we still have not gotten the full chicken richness out of most of these, the author suggests we are asking the wrong questions, and so are guaranteed to get the wrong answer.
The physics of the situation are outrunning the capacity of organizations to implement -- successful Google has gotten that way partly by looking at the 'price' end of the price:performance ratio, and not the 'performance' end. How many organizations can successfully emulate this feat is in question, because the fate of those who try to avoid, cope with, or just ignore Moore's Law seems to be equally dismal in the long run.
Of course, in the long run....
The continuing battles relating to copyright have already been recorded in this blog; the article indexed by this URL reviews the degree to which copyright has moved from a limited protector of individuals to a near-permanent protector of corporations. The question is discussed in some detail, with the inevitable negative consequences on creativity and art being clearly laid out.
This is another example where a large number of people collectively suffer a loss which is neither direct nor easily quantifiable, whereas the few who gain do so directly and in large measure. The imbalance which results may have long-term negative consequences even for the 'winners', but by the time this point is reached, there may be no return possible.
To the extent that copyright is now a global phenomenon, this is very grim news indeed.
http://www.opte.org/maps/tests/
We tell our students that the InterNet is the biggest network of networks -- but why not show them? The indexed URL leads to a site which is dedicated to mapping the Net, creating rather ethereal pictures which have a distinct neurological quality. Because the colour coding used distinguishes by region, for example, these pictures can be a good starting point for discussing the degree to which the Net has an uneven impact on the world.
http://www.pcworld.com/news/article/0,aid,114417,00.asp
The DVD Copy Control Association announced it was dropping a case against someone who posted information about DeCSS, which defeats DVD security technology. This is a body-blow, I think, to the overly-restrictive provisions of the DMCA which were contrary to requirements for free speech and open research. Similar cases elsewhere in the USA, as well as in Norway, have also resulted in these stranglehold provisions being struck down, making it harder for those who would control information with a fist of iron to succeed in their aspirations.
http://www.overclockersclub.com/reviews/amd_64_article.php
Here is an extensive article on the AMD Athlon-64 processors, including an explanation of their peculiar naming system. A side-by-side comparison of each chip type provides understanding of the critical differences between the processor types involved.
Additional information on Athlon-64 motherboards can be found in the following 46-page article:
http://www.hardwarezone.com/articles/view.php?cid=6&id=921
http://www.pbs.org/cringely/pulpit/pulpit20040122.html
In the continuing argument about offshoring, Robert X. Cringely opines that the government and business don't 'get it' -- that by allowing something which has been nationally profitable, like IT, to be moved away from the USA, the country risks a form of 'hollowing out', because even in the USA the available manpower is low in absolute terms.
Yet the author also gives plenty of examples of the USA and other countries shedding unprofitable industries for profitable ones. The key here is that both capital and labour are required for the sort of re-invention which appears to be the only way out of this dilemma, and the labour reserves required are being squandered, with nothing being done about it.
Well perhaps so -- but a better argument for alternatives is needed, and is not provided here.
http://www.techcentralstation.com/012204A.html
Article looking skeptically at the concept of open source, particularly as it relates to Linux. While it acknowledges the strengths and accomplishments of the open source movement, it also makes clear that ongoing maintenance is an intensive and costly chore. Moreover, it argues that software is a unique case -- the only one where open-source freedom can even attempt to apply -- and so open source does not represent an extensible economic model.
The final claim, that open source theorists advocate a form of socialization of creation through government subsidy, seems both wrong-headed and misplaced. It is by no means clear that all, or even many, of the open-source community are advocating this. Nor is it clear that there is something inherently wrong with the diversion of government resources to creative community support -- it was on this basis that the InterNet itself was created, for example.
The broader, more supportable point made here: there is no one model of software development which is correct in all circumstances.
http://www.itl.nist.gov/div895/carefordisc/CDandDVDCareandHandlingGuide.pdf
The whole question of reliability of optical media in the CD/DVD format has been the subject of some vigorous and alarming discussion over the past few years -- there is no doubt that these media are not as bullet-proof as sometimes claimed. The USA's National Institute of Standards and Technology has released a white paper: "Care and Handling of CDs and DVDs — A Guide for Librarians and Archivists" which can serve as a guide to the perplexed on this subject.
Very big point -- using adhesive labels on CDs is a bad idea -- something I have been rather certain about for some time. Another excellent example of Your Tax Dollars At Work.
http://csrc.nist.gov/publications/nistpubs/800-61/sp800-61.pdf
Here is a white paper presenting the recommendations of the National Institute of Standards and Technology in their "Computer Security Incident Handling Guide". Covering organization, handling, specific incident types, and a wealth of appendices with recommendations, questions, and supporting materials, this looks to be an excellent primer on its topic, and one which could be easily incorporated in the appropriate security curriculum.
Now this is an example of Our Tax Dollars At Work which has really done something worthwhile!
http://www.cio.com/archive/121503/jobfuture.html
Part of a section on IT futures [including software, CIO, and security], this article looks at two scenarios, one in which USA IT employment remains sizeable, and one in which it gets tossed offshore. In addition to the useful points made in the article, the commentary feedback also provides some stimulating discussion.
I happen to think that Scenario 2 is the more probable outcome, and nobody, including those who instigated it, is going to be happy with this outcome.
http://news.nationalgeographic.com/news/2004/01/0114_040114_robot.html#main
General article describing the results of an experiment to use a robot [which was in no way humaniform] to do some of the investigative tasks [hypothesis generation, experimentation, and interpretation] which human scientists do. The particular field was genetics, and the robot performed effectively. Some proclaim this as the first step in automating science, and others remark that true scientific genius will always be needed.
Again, I think we have heard this argument before. This certainly drives the capacity of IT as an enabler to new levels of ability, and represents a trend well worth watching. Perhaps scientists will join IT workers in the unemployment line!
And the line may get even longer if inventors have to join it -- the following article discusses a computer program called a 'Creativity Machine':
I confess myself a bit skeptical about this, but on the other hand, it does seem to be the logical outcome of work which has been ongoing in the AI field.
A more focussed application, devoted to helping scientists with literature searches, is described here, showing that the impact of IT development does not have to be negative for scientist employment prospects:
http://www.pcmag.com/article2/0,4149,1404005,00.asp
With the importance of blogs comes the importance of tools used to generate them. Here is a comparative review of the major blog tools, which has as its somewhat paradoxical Editor's Choice a tool which has a lower editor rating than the highest rated tool.
http://www.groxis.com/service/grok/
This is one of those tools that excites great extremes of veneration and vituperation, quite similar to the novel from which the tool's name came. For myself, hewing to the extreme middle, as is my wont, I consider this an interesting way to get a grip on massive amounts of information which would otherwise be impossible to assimilate.
It certainly represents a different way of looking at the InterNet, that is for sure, and deserves evaluation on those grounds alone.
A review of this product with a whole whack of links to other methods of data visualization and control can be found here:
http://www.masternewmedia.org/2003/12/17/desktop_visual_search_engine_extends.htm
http://www.mnot.net/rss/tutorial/
If you need a good tutorial explaining the basics of RSS to someone who is familiar with XML and allied Web technologies, then point your browser here. In addition to the content, there are links to further information should you wish to investigate further.
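To give a flavour of why the format repays study: an RSS feed is just structured XML, simple enough that a few lines of Python's standard library can read one. The feed below is invented for illustration:

```python
import xml.etree.ElementTree as ET

# A minimal (invented) RSS 2.0 document: a channel containing items.
feed = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://example.org/</link>
    <item><title>First post</title><link>http://example.org/1</link></item>
    <item><title>Second post</title><link>http://example.org/2</link></item>
  </channel>
</rss>"""

root = ET.fromstring(feed)
# Print the channel title, then one line per item -- the heart of any aggregator.
print(root.findtext("channel/title"))
for item in root.iter("item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```

Anything much beyond this is aggregation and presentation, not the format itself.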
The educational importance of RSS gets so little emphasis that it is not surprising it is not more widely used.
http://whitepapers.comdex.com/data/detail?id=1074104558_819&type=RES&src=KA_RES
The evolutionary contest between spam and spam control techniques parallels that between virus and antivirus software, with convincing echoes of the biological eponym. This white paper, with a title after my heart: "Spam: A Many Rendered Thing; An in-Depth Look at Current Trends in Spamming Techniques" looks at the variety of new techniques spammers use to outfox filters.
There is, of course, an odd and melancholy irony to all of this -- if the spammers succeed, they will remove all reason to use e-mail, whereupon they will be broadcasting to thin air.
http://www.pcpitstop.com/gator/
A major online annoyance is material from Gator [adba Claria] popping up when you surf, with potentially unpleasant effects on your system. The indexed URL explains all about this disservice, as well as what you can do about it.
My thought on the matter -- the Gator staff should be introduced to hungry examples of their eponym, on a one-to-one basis -- that would be fun to watch!
While the primary emphasis of this blog is definitely not on programming, since I could not code my way out of a wet antistatic bag, the fact remains that effective coding has to be one of the major building blocks to improved security. While this site is devoted to a specific book on the topic [with the highly arcane title of Secure Coding: Principles & Practices], it also contains a mailing list, and a book companion with Additional Case Studies, Checklists, Software Tools, Code Snippets, Bibliography and Links, Contributions, and Analysis of Topical Vulnerabilities.
All in all, this looks like a useful site to bookmark for those who can benefit from it.
http://personalpages.tds.net/~slambo/acronym.htm
Nothing wrong with a canonical list which fires puffed wheat meanings for common computer acronyms [along with the real meaning of the acronym/initialism], so here is a cleanly laid out site with no advertising or popups which presents just that. I never knew so many alternative meanings were possible....
http://www.microsoft.com/mscorp/facts/
As Mark Twain said: "There are lies, and there are damn lies, and there are statistics". Microsoft is feeling sufficient competitive pressure from Linux that they have started dishing out "the facts" which show that Linux is more costly while performing less effectively than Windows. True, false? There is some educational value in using sources like these as a case study in source bias, even if the facts are correct.
This site also misses the point that many are disenchanted with Microsoft for a variety of reasons other than cost -- and indeed, I would myself support an alternative OS of proved reliability even if the running costs [apart from the conversion costs] were equal, for many of those same reasons.
http://www.opticalkeyhole.com/keyhole/html/ipv6.asp?bhcd2=1074710939
IPv6 has been referenced a number of times previously in this blog. This article discusses the current state of play in this protocol's development. Business is behind these developments, as we increasingly disperse connectivity among objects -- as the article notes, all major router manufacturers support IPv6, as do the latest major operating systems [though from this article, the status of OS X is unclear].
The impact on vendors, current connectivity initiatives (the Asian perimeter is aggressively pursuing IPv6 implementation), and speculation that InterNet connectivity will come to replace other current connectivity measures are all discussed. If, as estimated, homes contain some 250 devices which could benefit from Net connections, then the need for IPv6 is as plain as the fact that its address space can easily accommodate this need.
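That the address space can accommodate the need is verifiable with back-of-envelope arithmetic; the population figure below is a rough assumption of mine (circa 2004), not from the article:

```python
# IPv6 addresses are 128 bits wide.
total_addresses = 2 ** 128

# Generous assumptions: ~6.4 billion people, one home each, 250 devices per home.
devices = 6_400_000_000 * 250

# Even on these terms, the space is touched to only a vanishing degree.
print(total_addresses // devices)
```

The quotient runs to 26 digits, so exhaustion from household gadgetry is not the worry.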
http://www.computerworld.com/printthis/2003/0,4814,88646,00.html
In a worst-case scenario for the InterNet by 2010, the result is complete chaos. Cheer up! We likely will not reach that state, because long before then we will have suffered a 'Digital Pearl Harbor' that will show how severely we need to change. Exactly what the nature of this will be is a matter of some debate -- but we won't like it, whatever it is. One of the major casualties of the disaster will be innovation, and another will be privacy.
I really can't argue much against these predictions -- they sound all too plausible.
http://www.hardwareanalysis.com/content/article/1672/
An extensive discussion of all the wrong ways to install hardware. Given that there is so much which can go wrong even without trying, seeing what will happen if you do devote time and resources to doing it incorrectly is a useful exercise. As well as supplying some interesting troubleshooting examples, this article can also serve as an Awful Warning.
The relevance of this at first blush may appear tenuous. My first defence is to say that the site is intrinsically interesting, but my second is to ask a question: why should IT workers be less concerned with interface issues than anyone else? Apart from the programmer's specific role in interface development, IT professionals generally are in the business of making the complex simple, not only for users, but for themselves as well.
I don't think there is any evidence suggesting that we have interface issues solved correctly across the IT board [when using a computing device is as simple as using a toaster, then we will have achieved some success]. It behooves us then to consider examples of bad interfacing, such as those supplied on this site, from all its aspects. The intended reaction is "WHAT were they thinking of?" -- that the answer was not an effective interface seems obvious, and can serve as a useful discussion starter in many aspects of IT education.
The site provides monthly and categorized archives.
http://www.pcworld.com/news/article/0,aid,114328,00.asp
A major and continuing source of security exploits is buffer overflows. AMD's 64-bit processors now add a feature called "Execution Protection", which prevents execution takeover after an overflow event. Intel is also looking at adding this technique. This looks like a major hardware solution to a persistent software problem.
http://www.linux-mag.com/2003-01/kernal_01.html
One of the more metaphysical questions relating to Linux is: "what is it?" -- this because the only portion of the OS which is truly, madly, deeply Linux is the kernel. All of the rest of the supporting 'paraphernalia' represents Something Else -- almost universally GNU, but not necessarily. The sorts of users which Linux must serve if it is to become a viable desktop alternative to Windows will suffer severe M.E.G.O. in relation to such a debate, and this is worth hoisting inboard by Linux enthusiasts.
That said, and it certainly has been, by me at least, Linux kernel development has, as a spectator sport, some of the same flavour as watching WINDOWS unfold into new versions, and this article discusses some of the kernel foundations and what directions are likely for its future development.
http://itmanagement.earthweb.com/secu/article.php/3298191
I tend to be a gloomy gus about security issues, and probably the balance of security-related posts on this blog reflects that. However, there is a small pile of evidence accumulating which suggests that hacking attacks are having less effect and are shifting to service denial and similar exploits rather than actual theft. Improved security measures are seen as the reason behind this improvement, but this also masks the fact that the attacks are more sophisticated and coming faster on the heels of vulnerability discovery.
Something in the epidemiology models would have suggested this was the case -- so there is some cloud surrounding that silver lining after all.
One of the common responses to a lack of discussion of Macintosh issues is that Macs are so easy to use and so reliable that the sorts of complex trouble-shooting which are such a feature of the IBM PC world simply are not needed. This site might persuade you that the truth, while out there, is not quite so starkly black and white.
And if that does not convince you, perhaps this will:
Actually, given the minority status of Macintosh systems, and the fact that they tend to be concentrated in professional niches, having some trouble-shooting sources is in fact quite valuable for any organization teaching about them.
http://entmag.com/news/article.asp?EditorialsID=6087
While Intel's Itanium was first out the 64-bit gate, AMD's Opteron and Athlon 64 chips rapidly surpassed it in sales because they had 32-bit backwards compatibility. The Itanium was a dud with 32-bit applications, and developers would rather resist than port. Comes Microsoft to the rescue with a posted 32-bit driver for the Itanium 2, which supposedly corrects these problems.
Anything which leads to greater competitiveness in 64-bit computing for the desktop is a good thing, in my books.
http://cl.com.com/Click?q=2c-DaaOIc0OqhKF1bqRXlAUBvCWr9RR
The indexed article discusses the important concept of secure identity management. The following materials offer other information about security:
A set of "Best Web Links" on security basics, "for those just entering the world of security", covering a wide range of topics, from biometrics to viruses, can be found here:
http://searchsecurity.techtarget.com/bestWebLinks/0,289521,sid14_tax281891,00.html
Another set of "Best Web Links" on common vulnerabilities is here:
http://searchsecurity.techtarget.com/bestWebLinks/0,289521,sid14_tax281934,00.html
Given the prevalence of Microsoft OFFICE in the workplace, some advice on locking it down for security is not amiss, and this comes from an expert, Roberta Bragg:
http://mcpmag.com/columns/article.asp?editorialsid=555
A white paper "The Secret to Simplified Firewall and VPN Security" covers a popular and significant topic:
http://searchSecurity.com/r/0,,16172,00.htm?stonesoft
Some straightforward security advice can be found here:
Knowing How Much Security You Need on a Windows 2000 Network
http://www.dummies.com/WileyCDA/DummiesArticle/id-1512.html
Breaking into the Basics of Network Security
http://www.dummies.com/WileyCDA/DummiesArticle/id-1808.html
Firewalls: Defending Your Network from Internet Attacks
http://www.dummies.com/WileyCDA/DummiesArticle/id-1518.html
http://www.pcworld.com/news/article/0,aid,110759,tk,dn051603X,00.asp
Ethernet is an amazing success story, given how it has been capable of adapting to successive requirements for improved bandwidth, in large part, I think, because its decentralized control model maps well onto the InterNet's similar model. This article suggests we will see 40Gb/s Ethernet in the near future.
Another, more extended look at future Ethernet speeds can be found here:
http://www.pcworld.com/news/article/0,aid,110950,tk,dn060203X,00.asp
http://mcpmag.com/columns/article.asp?editorialsid=576
The mysteries of subnetting can appear positively arcane to most networking students, so anything which could help them along is worth noting, such as this quick lesson on how-to-do-it.
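The mechanics such a lesson covers can be sketched with Python's standard ipaddress module -- the network chosen here is an arbitrary private range of my own choosing, not from the article:

```python
import ipaddress

# Carve a /24 into four /26 subnets -- a typical subnetting exercise.
network = ipaddress.ip_network("192.168.1.0/24")
for subnet in network.subnets(new_prefix=26):
    # Each /26 holds 64 addresses; network and broadcast leave 62 for hosts.
    print(subnet, "mask", subnet.netmask, "usable hosts:", subnet.num_addresses - 2)
```

Students who can reproduce that output by hand, in binary, have the mystery well in hand.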
http://www.usatoday.com/tech/news/2004-01-13-patentscover_x.htm
An overview article which, while providing some counter-arguments, demonstrates that the current USA patent system as it applies to software is stifling innovation and enriching parasitic legal lampreys. Many small businesses accused of infringing obscure and perhaps invalid patents cannot afford either to fight or comply, so they simply drop off the Web altogether.
When giants like Intel and IBM also express concern about this situation, it is clear that Something Must Be Done. Since my usual prescription for these sorts of situations requires the painful reorganization of bodily parts, I must recuse myself.
http://whitepapers.comdex.com/data/detail?id=1036681158_105&type=RES&src=KA_RES
Security threats to e-business, both established and pending, are sufficiently high-profile to make a white paper called "Internet Security - A Defense Model for E-Business" attractive without saying a word more.
http://whitepapers.comdex.com/data/detail?id=1073494621_104&type=RES&src=KA_RES
My attitude to IM is "I never use it, Sir, it promotes rust", but there is no question that this communications facility has a major impact in many organizations. At the same time, the security issues surrounding IM ought to give any networking professional pause, if not nightmares. This white paper: "Enterprise Instant Messaging - Essential Infrastructure" can provide a starting point for efficient, effective, and safe IM.
http://whitepapers.comdex.com/data/detail?id=1052750276_21&type=RES&src=KA_RES
Wireless is different, wireless is coming on strong, and wireless poses [as has been mentioned in this blog before] major security problems. Getting a grip on where to start may be assisted by this white paper: "Understanding the Layers of Wireless LAN Security & Management", which obviously goes beyond security issues.
http://security.itworld.com/nl/security_strat/01132004/
A brief but impassioned argument for the elimination of passwords, founded on what to me seems like an eminently defensible premise: passwords simply don't work any more. The old mantra that this is just a matter of user education is becoming woefully threadbare [not least because user education is not a trivial matter in the first place]. One possible alternative is challenge/response mechanisms, like smart cards; another is biometrics.
We have gotten to the point where we can use these more advanced methodologies with relative ease, so we really should get on with it.
http://utilitycomputing.itworld.com/4829/040106moreqthana/index.html
Some of the excitement about grid computing has been conveyed in other entries in this blog, but the URL above links to a 2-page PDF report telling a different story. According to Nucleus Research, few companies have any interest in this immature technology, the benefits of which will take some time to realize.
The page also indexes a number of other articles on grid computing.
http://entmag.com/reports/article.asp?EditorialsID=56
I am sure it says something (and equally sure that I don't know what) about the range of products Microsoft produces and the intensity with which it upgrades them that there could be reason for a "special report" constituting a roadmap of what will be coming out of Redmond now and in the near future. For 2004, the short answer is "not much", and while this may be a bad thing for the company and its stockholders, there are a number of reasons why it could be a good thing for the rest of us.
Just as you never can have too much memory or too large a hard drive, so too, you never can have too fast a system. Here is a site devoted to speed, with coverage of broadband internetworking and system enhancements, including reviews, articles, fora, and additional information. Network tools, security, and TCP/IP are also covered, and easily accessed by a sidebar.
The site is well-laid-out and easily navigated.
http://searchwin2000.techtarget.com/originalContent/0,289142,sid1_gci942843,00.html
The top 10 just plain whacked-out IT stories of the year. These incidents show that you don't have to have a computer to make a fool out of yourself, but it really does help!
http://www.cio.com/archive/121503/et_article.html
The potential for technology to eliminate most jobs has been explored in previous articles in this blog; here is another analysis of the trends which are affecting IT jobs in particular, including offshoring and the rise of autonomous computing [also explored in other articles in this blog]. When 43% of IT budgets are spent on labour, something has to give.
This article also suggests that there is some case for optimism: if technologists shift to more design-oriented specialties, and if the sheer number of potential jobs [1.5 million estimated for computer software engineers, support specialists, computer and network administrators, systems analysts, and database administrators in the next 6 years] in fact comes to pass, there will still be lots of employment opportunity in North America.
One wonders who is whistling past the graveyard here.
http://www.spectrum.ieee.org/WEBONLY/publicfeature/dec03/12035com.html
A number of rubrics conveying conventional wisdom in IT go by the name of 'law', when in many cases [like Moore's Law] they are more rule-of-thumb generalizations. This article looks at five instances of such laws, and concludes all are less than perfectly lawlike. Of those examined, it looks like "Rock's Law" [semiconductor tool costs double every 4 years] is the one which now has the least quantitative relationship to reality, being out by a factor of 5, whereas most of the other instances are only out by a factor of two.
Some, like Metcalfe's Law, are impossible to quantify fully, and thus the degree to which they are adrift cannot be evaluated in the same way as some of the others. Still, I can propound Ox's Law ["Rock's Law looks less like a law than a guess"] with a perfectly straight face.
Not only is this an interesting discussion of some major factors in the IT industry, but it can also be the starting point of many fruitful investigations, whether about the eponyms of these laws or about their current and future validity.
http://www.technologymarketing.com/mc/content/article_display.jsp?vnu_content_id=2030373
A group of noted computer journalists look at the past and future of the PC in an interesting and engaging article. The changing role of the computing press is also highlighted [it is clear that we have lost something in our gains in InterNet information]. There are problems to overcome, without doubt, but the PC still has a role to play at home and in the workplace, and companies can still make money in this market [one particularly valuable way to do so is to exploit an underfulfilled niche].
http://news.com.com/2100-7337-5131787.html
BIOS limitations have been increasingly throttling the potential for PC-platform developments, and computers are now available based on a new standard, Extensible Firmware Interface, which represents a considerable improvement and simplification of the pre-boot process. Despite the fact that nobody is lying on their backs, sticking all their arms and legs in the air, and yelling out "I'm a dead horse!" over this, the pressure for easy adaptation to new technologies makes something like this irresistible. The fact that both Microsoft and Intel are behind this initiative won't hurt either.
It may also improve the PC's hardware competitiveness with the Mac in terms of integrative simplicity. It will certainly require an update in texts and teaching approach in basic hardware classes.
http://www.networkmagazine.com/shared/article/showArticle.jhtml?articleId=16600116&classroom=
A major effort, led by an MIT researcher, aims to add 'intelligence' to the InterNet, with the result that some of the benefits touted by autonomic computing [self-diagnosis and repair of problems] will be enjoyed by the Net itself. The difficulties in this are not understated by this article; given that the anticipated completion date is 2010, we have some time to get used to this.
Well, yes, and doesn't this sound all too familiar to those of us who read SF and have in mind any number of stories where the global network 'wakes up'? Of course, there really is nothing to worry about....
http://technologyreview.com/articles/wo_garfinkel010704.asp
A vigorous discussion explaining how IPv6 is not the solution to all our problems, and may in fact be a problem itself. The success of NAT at staving off addressing shortages brings problems of its own. The degree to which IPv6 represents a major challenge to network architecture is not ignored, nor is the degree of progress being made in its implementation.
The page also contains links to related articles.
http://whitepapers.comdex.com/data/detail?id=1073402060_588&type=RES&src=KA_RES
E-mail spoofing is a serious problem, particularly with the development of 'phishing' scams, which use e-mail to direct victims to realistic-looking but bogus Web sites. A range of "Proposed Solutions to Address the Threat of Email Spoofing Scams" is discussed in the white paper of the same name indexed by this URL. Both prevention and cure are discussed; understanding the pros and cons of various approaches can be useful for teaching many aspects of networking as well as security.
Additionally, here is a Web site devoted to the phishing problem and what can be done to prevent it, with archives and news:
http://www.reuters.com/newsArticle.jhtml?type=technologyNews&storyID=4059246
Article on offshoring which keys on predictions that U.S. white-collar jobs such as programmers, software engineers and applications designers will be offshored at an ever-increasing rate [the rate is expected to double]. Multinationals, having experienced backlash about this, are keeping mum.
In fact, the following priceless remark is deeply symptomatic: "Nobody has come up with a way to spin it in a positive way." The explanation for that is simple: there is no way to spin this positively.
Moreover, as this article notes, offshoring is by no means risk-free, even if it will grow by 25%/year:
http://techupdate.zdnet.com/techupdate/stories/main/Top_10_Risks_Offshore_Outsourcing.html
Problems can range from lower cost savings than initially estimated to problems in knowledge transfer. Indeed, only the reduced cost impact of offshoring seems sufficient to balance against these risks.
Of course, none of this prevented the high and mighty, like Carly Fiorina [who does not have to worry about bread on the table today or tomorrow], from giving a rather "let them eat cake" remark, and economists [who, if laid end to end, would never agree on anything] continue to hold that this pain is actually good for us:
http://sfgate.com/cgi-bin/article.cgi?file=/chronicle/archive/2004/01/09/MNG6C46T0M1.DTL&type=tech
I think, when I think about it, that I am rather in the economists' court here, but when people who benefit to an almost obscene degree defend a practice which makes many of their contemporaries into dispossessed paupers, I also acknowledge without hesitation that this grates, and reveals the degree to which such speakers are pompous, heartless, and self-centered.
http://news.bbc.co.uk/1/hi/technology/3292043.stm
Report on Vint Cerf's take on the past and the future of the InterNet. The fact that initiatives are underway for making telephone numbers a Net address and creating a Naming Authority Pointer system which will allow an extensive expansion of the Net's address space beyond the DNS [for example, book ISBNs could become part of the address space] promises to increase the Net's ubiquity -- it really could reach the SF vision of an omnipresent environment in which we live, move, and have our being.
A direct quote from Cerf is, I think, a valuable corrective to some fulminations about Net content:
If you have the ivory tower view that the internet is good only if everything on it is good you are mistaken. ... The internet is a reflection of our society and that mirror is going to be reflecting what we see. If we do not like what we see in that mirror the problem is not to fix the mirror, we have to fix society.
Taken to a not-improbable extreme, this has the potential to be another of those iceberg tips about which I natter continuously.
While I mention Apple issues in this blog from time to time, nobody has stepped forward to donate the dual-processor G5 system with 23" Cinema display that I need to get hands-on experience, so Apples don't get much sauce from me. This URL indexes a blog which takes many bites at the Apple universe pie.
Like any good blog, it offers recommended readings, sources, and software links.
http://www.preshweb.co.uk/linux/howtos/dos/
While I regard DOS as such a historical relic that I do not even provide an indexing term for it in this blog, there may be times when you have to run a DOS application on a Linux system, and you don't need the cost or power of a full-blown OS emulator like VMWare. This article describes how to use a free DOS emulator along with a copy of the FreeDOS operating system, complete with active links.
Much of the specific focus is on making games work, but it is still a well-written and helpful primer for anyone wanting to run DOS under Linux.
http://www.straightdope.com/mailbag/mplurals.html
OK, I admit it, I was worng, rawng, REAUGHNG! For years I have resisted the locution 'viruses' in favour of the more euphonic 'viri'. However, I have about enough Latin to be able to say coma canensis on the day after...so this detailed, erudite, and ultimately devastating analysis of the hows and whys of proper pluralization has convinced me to the point that I make a public confession in humiliation and remorse.
From now on, the plural of 'virus' is 'viruses'. Case closed!
http://open.itworld.com/4917/040105holekernel/page_1.html
The chortling by Linux fans whenever a Windows security exploit is reported was rather muffled by publication of a serious security hole in the Linux kernel. A kernel hole is exceptionally serious as a vulnerability class, since it can allow attackers to destabilize the OS or take control of the system.
Patches to fix the hole have been made available. The point worth study here is: was there any difference between the genesis, gestation, revelation, and resolution of this security issue in Linux and a similar case in Windows? The outcome of this could be a stronger recommendation for one OS type over another, or a realization that the Linux community has been overstating the case, perhaps "more than somewhat".
http://www.nytimes.com/2004/01/03/business/03consult.html
The pros and cons of offshoring IT work have been discussed extensively in other articles indexed by this blog. The above article looks at how the phenomenon works, and demonstrates how consulting companies are using offshoring as one of the few cost-cutting recommendations which are available for implementation. As a result, the pressures for offshoring are as likely to come from those who provide organizations advice as they are from within those organizations.
This just makes this phenomenon that much more difficult to combat, assuming that opposing it made strategic sense in the first place.
http://www.computerworld.com/securitytopics/security/story/0,10801,88359,00.html
From my worm's-eye view, in the security battle, it looks like the bad guys are winning -- the level of disruption I have experienced this year on the InterNet is much greater than any year previous. However, this article suggests the light at the end of the tunnel is not an oncoming e-mail virus -- that tools which will mimic an immune system will be available as the result of ongoing research, reducing the threat accordingly.
We badly need something like this -- we cannot ask all computer users to become security experts to do their daily jobs, after all.
http://www.infoworld.com/infoworld/reports/49SRiw25.html
A retrospective on 25 years of information technology is one thing, but coupling this with an attempt to forecast the next 25 smacks of hubris. Nevertheless, the indexed article attempts just that, suggesting that the major foci for advances will be:
Pervasive computing, with consumer electronics showing the way
Computers that mimic intelligence
The invisible workforce: as IT becomes invisible, so do the workers who keep it that way
After silicon: Biocomputing, where organic processes become the model for future technology
In addition, some IT leaders make predictions about the future, about which one wonders if we should keep a scorecard.
Still, the future is going to happen, so we might as well expend some skull sweat in seeing whether these sorts of predictions are wrong.
http://www.reflectivesurface.com/weblog/archives/2003/12/31/blogs_as_information_spaces
Analytic article which attempts to examine why blogs seem to take the same tired and true approach to interface. Potential alternatives, as well as the existence of inherent barriers, are discussed in a thought-provoking fashion. The concept of 'information space' is evoked as a profitable metaphor.
As the author notes, the role of entropy in this ought not be misunderstood.
As a counterpoint, here are some showcase examples of outstanding blog designs:
http://www.cre8d-design.com/journal/archives/cat_blog_design_showcase.php
Seeing some of these almost makes me want to quit blogging -- but I can't -- I just enjoy it so much!
Perhaps, therefore, I should pay attention to time-saving advice for bloggers given here:
http://blogs.salon.com/0002007/2003/12/29.html#a571
http://whitepapers.comdex.com/data/detail?id=1071837167_677&type=RES&src=KA_RES
One of the advantages of Active Directory is the much larger number of accounts which can be managed using it. Here we have a white paper: "Deployment Trends and Methods for Optimizing Large Active Directory Configurations" which speaks directly to issues relating to converting from NT 4.0 to an AD-enabled Windows version. It also looks at AD use within larger organizations, and thus looks like a useful case study.
http://www.kernelthread.com/mac/osx/
A detailed look at the Macintosh OS X, covering its history, architecture, and features. Programming for the OS and a discussion of the available software running under the OS are covered, and other sections discuss what's good about the OS and hacking tools available for working with it.
Certainly worth looking at as an independent [and positive] description of what's up with this OS.
http://news.bbc.co.uk/1/hi/technology/3340491.stm
The notion of pervasive computing, which I have referenced previously in this blog as "making the world smart" would appear to be as unexceptionable as Mom and apple pie. Enter the Philip Wylies of the computer world, who call for more social reflection on whether pervasive computing is a good thing or not.
In principle, such second thoughts seem like a good thing in and of themselves, but I have my suspicions that in an arena as complex as this, not only are we unable to pause for thought, but we really cannot adopt any useful vantage point in any case. It may well be that we will only be able to see effective coping strategies after we have suffered the deluge.
Indexes a project which should be extremely interesting to those teaching networks and operating systems -- the ADIOS objective is to provide downloadable installable OS versions where students have administrative privileges. This looks like an excellent way for students to gain administration experience without putting real networks at risk.
The website contains a comprehensive explanation of what is involved with this project, which certainly looks worth investigation.
Perhaps nothing symbolizes the Linux-Windows divide as much as the fact that the former recompile their kernel about as much as the latter recover from crashes. While I don't think this is the sort of thing we want most home users to have to be doing, the fact remains that for any degree of competency in Linux, familiarity with kernel building issues is a must.
Of course, a hand-built kernel implemented by someone who is savvy about hardware and security will be both fast and resistant to disasters, which is another bonus, for sure.
But I still think most people are going to want to build their own kernels about as much as they will want to change the piston rings on their cars, and that too is a fact of life.
http://whitepapers.comdex.com/data/detail?id=1070988580_381&type=RES&src=mu_ab
Another slant on the IT employment issue, looking at the question "U.S. Programmers and Analysts: Endangered or Just Wounded?". The overall picture is related to historical trends, suggesting that those in the IT industry must use strategic management of their own careers to maximize their opportunities.
Another perspective suggesting the supply of IT jobs is growing can be found here, along with a sidebar indexing a number of related employment articles:
http://www.cioupdate.com/career/article.php/328945
http://www.bankinfosecurity.com/?q=node/view/334
Well-written article on assessing the risks of a wireless network, along with an enumeration of risk management considerations. Other portions of the site discuss many aspects of security, with articles on topics ranging from Sarbanes-Oxley to Security & Privacy. I have touched on wireless security in previous postings to this blog, and this is a useful addition to such material.
A site which does for networking what "Tom's Hardware Guide" does for hardware. This site offers news, polls, reviews, features, product guide, problem solving resources, links, FAQs, and top 10 lists. "Features" includes an extensive set of reports on major network conferences and trade fairs.
http://marshallbrain.blogspot.com/2003_12_01_marshallbrain_archive.html#107294501224793568
Indexes a blog archive from Marshall Brain, who comes to the conclusion, in relation to the amount of time wasted in repairing computers: "this is just nuts". Having spent a substantial part of my holidays on exactly this issue, I can sympathize -- and this is from people who, prima facie, are supposed to know about this stuff.
And then we wonder why the mythical "average user" gives up or operates an insecure and poorly performing system. I think that Brain's main point -- that we should have a secure and easy-to-use operating system by right -- has a lot going for it, and as I have suggested before, the first manufacturer to realize this will make a bundle.
http://www.computerworld.com.au/index.php?id=2057465071&fp=16&fpid=0
According to this author, we have a bushel basket of new security challenges awaiting us in 2004, which will have to be met in organizations by improved and stricter [and unpopular] controls. Resisting the compulsion to connect, understanding that new technology developers don't put security first, and remembering that the bad guys are endlessly creative are three keys to understanding how security issues are going to play out in the future.
http://www.forio.com/outsourcing.htm
Looks at the offshoring phenomenon from the point of competitive advantage, suggesting that when the major portion of production costs comes from design, it makes less sense to outsource. The more custom-tailored a 'product' is, the more difficult it is to offshore.
The real danger, according to this article, is for firms to compete on operational effectiveness [which is what offshoring improves] alone -- this amounts to betting the farm that you are more competent than the competition. The negative effects of offshoring on innovative capacities may, in the long run, lead to the loss of even more jobs.