http://entmag.com/news/article.asp?EditorialsID=6070
Article pointing out that Microsoft is going to begin the retirement process for Windows 2000 starting in April, 2004. This is certainly much too soon, in my opinion. Given W2K's considerable advantages over NT4.0, and seeing how resistant that latter OS has been to universal upgrade, I would expect similar heels-digging over W2K.
After all, if you have implemented an Active Directory installation and have it working, the chances are that any hair remaining on your head is grey or white, so why would you want to go through this all over again?
Either the difference between 2K and 2003 is so major that significant planning and BST will have to be expended on it, or else the difference is minor, in which case, why upgrade at all?
http://www.nwc.com/showitem.jhtml?docid=1424ibg12
Thin client computing is one of those ideas I regard as thoroughly bad [like a pre-breakfast cigar], but it will not go away. And since I am devoted to the principle of equal time for sensible opposition, here is an article on the current status of thin-client computing, which deals with prospects, questions, and links to manufacturers.
The record on thin clients is not particularly encouraging to date, but persistence sometimes pays off.
http://www.securityfocus.com/infocus/1752
An in-depth comparison of how three different worms (Blaster, Slammer, and Code Red I/II) impacted networks once the external security was breached. This is a useful examination not only of the effects such malware has, but also of how to create a remedial plan.
http://www.forbes.com/home/2003/12/16/cx_el_1217macmoments.html
While it is hard to believe that 20 years have passed since the introduction of the first Apple Macintosh [where does the time go?], I can well remember the first Mac's introduction and thinking "that's about half the computer I want for about twice the price" -- this at a time when the same capability for IBM clones was nowhere in sight. Alas, my initial thought has remained true for those 20 years too -- it always seems that Apples cost more than I want to pay for the performance I am getting. When I look at the current Apple of my eye, the dual-processor G5 with the 23" Cinema display, I am facing a charge of some $6,000, when the equivalent IBM machine would cost me about half that [apart from the display], and $6,000 spent there would buy a much more muscular system.
On the other hand, to be able to compute without driver issues, without crashes, without the temptation to go to MicroCenter and spend all my money on doodads which then stay safely packed away in their boxes because I never have time to install them, and without the same degree of attack through InterNet connections, well, that has to be worth something. Still, however much I review the issue, I doubt there is a Mac in my future -- but desktop Linux looks very possible indeed.
At any rate, and to the point, the URL indexes a set of articles on the 20th anniversary of the Macintosh, and recounts key moments in the life of this technology and that of its parent company.
Here is another retrospective on the Macintosh's birthday:
http://www.pcworld.com/news/article/0,aid,114418,tk,dn012304X,00.asp
http://www.youdzone.com/signature.html
The concept of asymmetric key encryption is a security issue upon which many students' [and faculty's] understanding founders. Here is a simple explanation of public key cryptography, digital certificates, and certificate authorities which makes the outlines of the process somewhat easier to grasp.
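To make the asymmetry concrete, here is a toy Python sketch of the sign-and-verify dance, using textbook RSA with comically small primes -- fine for a whiteboard, useless for real security, where keys run to 2048+ bits and what gets signed is a padded hash:

```python
# Toy RSA signing/verification -- illustrative only.
p, q = 61, 53              # two (secret) primes
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent: modular inverse of e

def sign(message: int, priv: int) -> int:
    """'Encrypt' with the private key: only the key holder can do this."""
    return pow(message, priv, n)

def verify(message: int, signature: int, pub: int) -> bool:
    """Anyone can check the signature with the public key."""
    return pow(signature, pub, n) == message

sig = sign(42, d)
print(verify(42, sig, e))   # True: signature matches the message
print(verify(43, sig, e))   # False: a tampered message fails
```

The whole trick students need to grasp is in those two functions: what one key locks, only the other key unlocks.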
While plug-n-play has taken a lot of the pain out of hardware settings, there is no sensation quite like opening an older computer box and having one or more little square black plastic bits come raining out on the worktable. And is there any documentation for this lost jumper phenomenon, which could, at its worst, result in the conversion of a PC to a passable doorstop? Silly you, that you would even think to ask such a question!
Well, here is a resource with jumper settings for over 18,000 devices, so if jumpers are piling up and threatening to submerge your ankles while you do your computer work [most disquieting phenomenon, that], hop on over and see if it does not have the documentation you need.
No sooner do I write a couple of screeds fulminating about the advantages of the Linux desktop for all and sundry than I come across, courtesy of the wonderful Lockergnome Linux Fanatics listserv, the site for the Desktop Linux Consortium, which, one strongly suspects, is a consortium of parties interested in desktop Linux.
Since it is just at the 'getting started' stage, if making the waters of Linux rise floats your boat, why not mosey on over and hop on in?
http://www.linux-mag.com/2003-09/viruses_01.html
The lesser vulnerability of Linux to viri and similar malware is a plus, but "lesser" does not mean "none at all", as this article, which discusses Linux viri, clearly indicates. It provides a good primer on how Linux can be affected by a virus, with some case studies, and it also touches on the added problems involved with running a Windows emulator.
All this means is that the shouting about the 'weaknesses' of Windows, which has come from several sides in the OS debate, increasingly comes down to the issue of a software monoculture. Even though I accept that by design and heritage Linux and OS/X are somewhat more resistant to malware writers [and I think I am toning down this appreciation a bit], the fact remains that the major reason why Windows systems are most affected is because of their majority status.
Taken to its extreme, this argument implies that if the alternative OS deployed in homes and offices shrug off most of the Windows badness, we have to hope their deployment does in fact remain restricted, precisely so that the 'target' they offer to malware authors stays untempting, allowing their users to compute in peace.
http://www.tldp.org/HOWTO/Hardware-HOWTO/
Rejoice in it or regret it, the fact remains that for most distros, those installing Linux must have a more intimate knowledge of hardware than their Windows/Mac counterparts. The above guide is a source of information about Linux and hardware compatibility. Here are some others:
http://hardware.redhat.com/hcl/
http://www.linux-mandrake.com/en/hardware.php3
http://www.eskimo.com/~lo/linux/hardwarelinks.html
Whatever your opinion on this [and I feel it is a Good Thing in those months it is safe to eat oysters, and a Bad Thing otherwise], Linux will never get the sort of home penetration that Windows 98 did until home versions 'hide the hardware weeny' as effectively as W98. Since I think there is a big market there, if properly developed, such a prospect bears careful consideration.
http://sensorsmag.com/articles/1103/22/
One drum I have been pounding in this blog, using the leg of one of the big bees drawn from the hive in my bonnet, is the implication of distributed sensors ['smart dust'] for networking and communication, as well as life, liberty, and the pursuit of happiness in general. Here is an excellent long illustrated article on the current state of play with small sensors, and what is likely to be available in the future.
Applications in several major areas are discussed, references are provided, and further readings are suggested. In general, I am prone to postpone consideration of the immediate future until it gets here [the long-term future, I have mooted previously, requires much more care and planning], but in the case of small, smart sensors, I think the issues and capabilities are so front-and-center [as well as, paradoxically, being so elusive] that it is worth considering them now.
Mighty oaks from little acorns grow -- the implications of smart sensors could be "vaster than empires" while not being slow in any sense of the word at all.
http://www.nwfusion.com/news/2003/1215ipv6.html?net
Article reporting the latest findings on IPv6, a topic mentioned several times previously in this blog. Those who have adopted it say the transition doesn't hurt anywhere near as much as the doomsayers suggest. v6 adoption in government and academe is outpacing its implementation in the business sector. It is, however, worth taking time to consider the whole process of v6 adoption, because this allows its many benefits to be implemented in the most effective manner.
The article contains a sidebar with links to related stories.
http://news.com.com/2010-7345-5121763.html
XML was originally intended to be the universal language for Web pages and Web-enabled applications, but its advent collided with the big bubble bust. Despite promises that it would become a standard for applications interchange [as, for example, made much of by Microsoft in its plans for OFFICE], it is Not There Yet. But as this article indicates, support for XML standards is growing, and it looks like the long-delayed maturation of this technology is at hand.
One thing which networking folk should regard in all of this [and indeed, originally this was front-and-center when Server 2003 was known as Server.NET] is that the sort of programming which XML represents is going to become essential for network operation as well. For those networkers working in a *NIX environment, and for Windows [and other OS] administrators who do scripting, this will not come as much of a shock, but for those, especially in smaller installations, who think of programming as a separate and arcane black art, the development of XML is indeed portentous, because, I think, it blurs the boundaries between programming and network administration.
This is no small thing.
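For the admin who does not think of himself as a programmer, the gentle slope looks something like this: a few lines of Python reading a server inventory kept as XML [the file layout and server names here are invented for illustration]:

```python
import xml.etree.ElementTree as ET

# A hypothetical server inventory an administrator might keep as XML.
config = """
<servers>
  <server name="dc01" role="domain-controller" ip="192.168.1.10"/>
  <server name="web01" role="iis" ip="192.168.1.20"/>
</servers>
"""

root = ET.fromstring(config)
for server in root.findall("server"):
    print(server.get("name"), server.get("role"), server.get("ip"))
```

Once the inventory is machine-readable, scripts can act on it -- and at that point the administrator is programming, whether the job title says so or not.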
A lean and clean set of support fora [they say 'forums', humph!] covering every major operating system and several minor ones, plus hardware, drivers, software, IT reports, how-tos, news, articles and opinions, installation guides, and a variety of site search tools.
http://www.tbray.org/ongoing/When/200x/2003/07/30/OnSearchTOC
Here is a searching series of articles on, well, searching -- a really deep delve into a topic which may seem simple, but which is not. Solving search issues, in the author's opinion, is essential for having computers work as real information tools instead of barriers to overcome. I quite agree with him -- this is a good example of the way in which the intellectual/theoretical 'plumbing' has to be put in place before any technological solution can really do much good.
http://mcpmag.com/columns/article.asp?editorialsid=530
For Server 2003, as this blog already exemplifies, there is no end of commentary -- so of course here are some more articles:
Details of setting up a Server 2003 lab:
http://mcpmag.com/features/article.asp?editorialsid=337
Now you have the lab set up, here are some exercises:
http://mcpmag.com/features/article.asp?EditorialsID=338
A Microsoft product without holes needing patching is like a golf course upon which you can never score:
http://mcpmag.com/features/article.asp?editorialsid=336
Dilettantes talk case mods, amateurs talk configuration, but professionals talk policy:
http://mcpmag.com/features/article.asp?editorialsid=328
The road goes ever onward, and Microsoft goes with it:
http://mcpmag.com/features/article.asp?editorialsid=326
If all servers were IIS 6.0, would you let one host your sister's Web pages?:
http://mcpmag.com/features/article.asp?editorialsid=330
What's .NET got to do with it, what's .NET but an overblown IDE?:
http://mcpmag.com/features/article.asp?editorialsid=307
And what all this means for MCSEs, MCSAs and all the other certification issues is covered here:
http://mcpmag.com/news/article.asp?EditorialsID=567
http://www.mcpmag.com/news/article.asp?EditorialsID=565
http://www.mcpmag.com/news/article.asp?EditorialsID=564
http://www.mcpmag.com/news/article.asp?EditorialsID=549
http://www.mcpmag.com/news/article.asp?EditorialsID=543
http://www17.tomshardware.com/mainboard/20030401/index.html
Article explaining the latest state of play in computer RAM -- a topic almost as volatile as the commodity itself. While SDRAM dominates now, in the near future the Double Data Rate technologies will take over. Since RAM is essential to effective computer operation on all levels, it is good to have an unbiased description of what is going on which is also up to date.
http://www.cnn.com/2003/TECH/ptech/04/15/fortune.ff.trends/index.html
While doomsayers consider the world lost in woe, there are reasons for optimism, as evidenced in technology trends to standardization (lowering costs), open source (lowering costs), wireless (breaking infrastructure barriers), 'living' data (more powerful software), and selling software as a service (cost effective applications). The position taken here that the InterNet, as implemented in these technologies, is a great leveller, has certainly been disputed by other postings in this blog -- but there is no reason why the optimistic view may not be right.
http://www.eedesign.com/story/OEG20030415S0027
The OSI model in all its gory glory is a staple of networking textbooks, but it still may provoke cognitive resistance. The URL indexes a two-part article on the model, paying particular attention to the upper layers, which often get banished to the wiring closet in textbook discussions. The article also mentions some model-related issues in networking communication.
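As a crib for the textbook-weary, the seven layers can be laid out in a few lines of Python [the example protocols are the usual suspects, not an exhaustive list]:

```python
# The seven OSI layers, top to bottom, with example protocols/technologies.
OSI_LAYERS = [
    (7, "Application",  "HTTP, SMTP, DNS"),
    (6, "Presentation", "TLS/SSL, character encoding"),
    (5, "Session",      "NetBIOS, RPC session handling"),
    (4, "Transport",    "TCP, UDP"),
    (3, "Network",      "IP, ICMP, routing"),
    (2, "Data Link",    "Ethernet framing, MAC addresses"),
    (1, "Physical",     "cabling, signalling"),
]

for number, name, examples in OSI_LAYERS:
    print(f"Layer {number}: {name:<13} e.g. {examples}")
```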
http://certcities.com/editorial/columns/story.asp?EditorialsID=144
Interesting problem article relating to DNS, which when worked out provides a good understanding of how TCP/IP protocols and DNS interact to produce results, and thus is a good teaching tool. Some of the commentary attached to the article is also educational, in its own way.
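For a hands-on taste of the TCP/IP-and-DNS interaction the article explores, the standard Python socket library will show what the resolver hands back [what you get for any particular name depends, of course, on your own resolver]:

```python
import socket

# Resolve a hostname the way applications do: the resolver library hands
# the query to DNS, and the answer feeds straight into TCP/IP.
def resolve(hostname):
    """Return the unique IPv4 addresses the resolver reports for hostname."""
    results = socket.getaddrinfo(hostname, 80, socket.AF_INET,
                                 socket.SOCK_STREAM)
    return sorted({entry[4][0] for entry in results})

print(resolve("localhost"))   # ['127.0.0.1'] on most systems
```

Having students predict what a given name will resolve to, and then checking, makes the abstraction pleasantly concrete.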
http://www.mywebattack.com/gnomeapp.php?id=106380
The URL indexes a download area for a free, lightweight network scanner which allows you to listen to TCP ports, check for shared drives, and observe a number of network issues which are worth learning about. Good for all flavours of Microsoft Windows. Even better, the download page lists similar freeware programs to try, as well as a sidebar of related categories of software.
There is hours of downloading fun here -- the real fun is trying all these out, and evaluating which works best for the particular learning application you need to support.
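For those who want to see what such scanners do under the hood, the core trick is nothing more than attempting TCP connections, as in this minimal Python sketch [the port list is illustrative -- and probe only machines you are entitled to probe]:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Attempt a TCP connection; True means something is listening."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe a few well-known ports on the local machine.
for port in (22, 80, 139, 445):
    status = "open" if port_open("127.0.0.1", port) else "closed"
    print(f"port {port}: {status}")
```

The freeware tools add threading, service fingerprinting, and pretty reports, but this is the kernel of the idea.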
http://support.microsoft.com/default.aspx?scid=/servicedesks/fileversion/dllinfo.asp
'.DLL Hell' expresses a major issue in dealing with Microsoft software -- file version conflicts between different versions of dynamically shared libraries. Before tearing out your last hair, try this database which indexes information about the .DLLs shipping with Microsoft products.
http://www.techreview.com/articles/sahin1203.asp
Whenever anyone asks me why I have such a swollen head, I reply: "My bonnet is full of buzzing bees". The whole issue of how innovations get fostered and adopted amounts almost to a complete hive. The impression I get -- that we are not using our wealth of tools wisely or well -- reflects, au fond, difficulties in innovation adoption. Here is an article which speaks to the state of innovation today, explaining how economic forces have produced stagnation in this area, and suggesting some potential cures. All very interesting in and of itself, but also, I think, relevant in some way to the more specific organizational theme.
http://whitepapers.comdex.com/data/detail?id=1069861581_120&type=RES&src=KA_RES
A short white paper: "Beware Spyware" which gives a quick overview of this type of malware, useful for informing teachers and students alike. If people read something simple and basic about this, which looks digestible, they may be more motivated to do something about it. I would be prepared to bet a small chocolate bar that home users in the thousands still do not appreciate the spyware threat, even though they are suffering the consequences.
http://www.pcmag.com/article2/0,4149,1408953,00.asp
An article taking a gleeful chortle over the revelation of a serious security vulnerability [which would allow a Mac system to be taken over remotely] in the Macintosh OS/X Jaguar/Panther release. Mac enthusiasts have been echoed by remote observers like yours truly in the assumption that the reduced vulnerability of Macintosh systems could justify their higher purchase price.
Say it ain't so, Steve! Well, in fact, there is somewhat less to this, I think, than flashes on the screen. It may well be that protection through minority status has resulted in this flaw not being exploited as yet, but I consider it a completely valid assumption that OS/X, with its UNIX roots, is inherently less susceptible to security flaws, and the degree of OS implementation has little to do with this. This is not the same as saying the OS has no flaws, just fewer flaws, and a better way of reducing such exploits when and as they happen.
But never let it be said I was hostile to exposing opinions which differ from mine, no matter how wrong they might be....
http://www.eweek.com/article2/0,4149,1413403,00.asp
The importance of RSS as a content-distribution tool has been mentioned several times in this blog. As the indexed article indicates, however, there is a server bandwidth problem if an aggregator wants to update frequently [and the more frequent the update, the more responsive 'the digital nervous system']. The article also suggests a potential cure: BitTorrent peering which allows the RSS stream to be disaggregated and tossed around by a multitude of servers. Somehow, given the distributed character of the InterNet, this solution sounds right -- producing yet more ripples on a seemingly calm surface.
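BitTorrent is one cure; plain HTTP politeness is another already available to aggregator authors. A conditional GET lets the server answer "304 Not Modified" instead of re-shipping the whole feed on every poll, as in this Python sketch [the feed URL is a placeholder -- substitute a real one before running]:

```python
import urllib.error
import urllib.request

FEED_URL = "http://example.com/feed.xml"   # placeholder feed address

def build_request(url, last_modified=None):
    """Build a GET request, adding If-Modified-Since when we have a date."""
    request = urllib.request.Request(url)
    if last_modified:
        request.add_header("If-Modified-Since", last_modified)
    return request

def fetch_if_changed(url, last_modified=None):
    """Return (body, new Last-Modified) or (None, last_modified) on a 304."""
    try:
        with urllib.request.urlopen(build_request(url, last_modified)) as resp:
            return resp.read(), resp.headers.get("Last-Modified")
    except urllib.error.HTTPError as err:
        if err.code == 304:            # feed unchanged since the last poll
            return None, last_modified
        raise
```

An aggregator polling every few minutes with this handshake costs the server almost nothing when the feed is quiet, which takes some of the sting out of frequent updates.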
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-105-1-1-618817-4347-1
The issue of Radio Frequency Identification devices has already aroused a storm of controversy, as noted in earlier remarks in this blog. Like any other technology, there is a 'right way' and a 'wrong way', and the corporate take on this may not be the correct one. This article suggests some guidelines for RFID implementation, which itself is one of the ways 'we will make the world smart', and is probably therefore inevitable -- but we might as well get it right.
That the sailing is not completely smooth on this electronic sea can be gauged by this account of Wal-Mart's problems with RFID:
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-105-1-1-618817-4350-1
On the other hand, if the army of the USA is enthusiastic about this technology, it can't be all bad:
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-105-1-1-618817-4344-1
A more balanced examination of this technology is provided in this report from Harvard on a conference which discussed RFID issues:
http://hbswk.hbs.edu/pubitem.jhtml?id=3879&sid=-1&t=special_reports_cyber2004
http://www.cioinsight.com/article2/0,3959,1387759,00.asp
Long interview, with references, with John Patrick, a seasoned observer of business and technology, about the utility and future of blogs. The importance of blogs as an alternative form of communication is stressed. One of the powers of the computer is that it can be any tool; one of the powers of the blog is that it can take any one of a multitude of forms.
Because they are bottom-up rather than top-down, blogs can deliver on the promise of knowledge management.
http://www.esecurityplanet.com/trends/article.php/3288271
Everyone agrees that e-mail is broken, and now some fixes are being proposed. The latest concept is a technical specification enabling e-mail recipients to verify sender identity, which then could be extended into a reputation report. Experts agree that e-mail identity is the requisite first step to reform. The pros and cons of this have been highlighted in this blog, because I feel this is no small issue in the way in which the IT environment is evolving.
Despite the eloquence and the genuine case that anonymity proponents have mustered in this debate, I still find myself, somewhat uncomfortably, under the tent of the identity brigade. In some sense, this demonstrates how central e-mail has become to the computing experience of most of us.
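One of the concrete proposals in this space is SPF, whereby a domain publishes in DNS the addresses permitted to send its mail, and receivers check the connecting IP against that list. Here is a much-simplified Python sketch of the receiving side's check [the record is invented, and real implementations fetch it from DNS and handle many more mechanisms, such as "include:" and "mx"]:

```python
import ipaddress

def spf_permits(txt_record, sender_ip):
    """Check sender_ip against the ip4: mechanisms of a simplified SPF record."""
    ip = ipaddress.ip_address(sender_ip)
    for term in txt_record.split():
        if term.startswith("ip4:"):
            if ip in ipaddress.ip_network(term[4:], strict=False):
                return True
    return False

# A made-up record: this domain permits one /24 block and one lone host.
record = "v=spf1 ip4:192.0.2.0/24 ip4:203.0.113.5 -all"
print(spf_permits(record, "192.0.2.77"))    # True: inside the permitted block
print(spf_permits(record, "198.51.100.1"))  # False: the "-all" would reject it
```

Identity first, then reputation: once a receiver can trust who is sending, the reputation report the article describes becomes possible.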
Article summarizing the USA National Cyber Security Summit, which came up with a recommendation for more secure code and coding practices. This will involve a massive effort, requiring inter alia extensive retraining for those software developers who are already in the production stream. Similarly, current curricula must be revamped to give additional emphasis to responsible development with security in the main focus.
There is a lot more disagreement on the 'how' of this, and what the most effective model should be, but the output from this conference would not go amiss as the input to future curriculum development in software engineering [where, I must hasten to point out, I cannot claim even the thin veneer of expertise I profess in terms of networking].
http://www.computerworld.com/securitytopics/security/story/0,10801,87554,00.html
Article with two salient points of interest. One revolves around the ever-increasing capability of malware, which will only grow as hardware and software become more powerful. That the bad guys appear to be winning the war suggests this pessimistic take has a lot of merit.
But hidden away on the second page of this article is an arresting little chart, which shows the date at which a computer implemented the processing power of some living organism. For example, the processing power equivalent of a bacterium was available in 1975. I was under the vague impression that we were at the insect level today, but according to this, we passed lizard equivalency in 2000, and are making strides towards the capacity of the average mouse.
While human capacities are nearly two decades away, according to this [and I suspect 'the devil is in the details', and the timespan may be longer than that], just imagine something considerably lesser -- a computer system with the responsiveness and processing power of a dog. Such a level of achievement would itself be a massive upgrade in the ability to use computers as a tool, and would be made even more impressive if we could teach such computers not to make a mess indoors....
http://www.informationweek.com/story/showArticle.jhtml?articleID=16700356
Windows 98 is finally being put out to pasture to join Windows 95 -- the key factor here being that Microsoft no longer has any obligation to release security hotfixes for either product after 16 January, 2004. I have remarked on the prevalence of 'outmoded' OS before, but to hear statistics quoted that as of the end of 2003 a whopping 27% of all installed Windows machines are W95/98 is something of a stunner. However undesirable these older Windows versions may be on performance grounds [can you spell "crash", Billy?], their 'retired' status now makes them a severe security risk if they are connected to the InterNet.
As this article notes, the softening of the economy made the traditional upgrade in 2000/01, which would have replaced W9x, much less attractive, accounting in part for this lingering OS aroma. But that was then and this is now, where [providing a monitor does not need to be purchased], perfectly reasonable desktop systems have a sticker price around $300. On this basis, it is hard to justify refusing to upgrade, particularly for connected systems, or to look at alternative OS [which, in the case of the penny-pinching, means the latest Linux release].
This is not just a matter of security, important though that may be, but also the deplorable condition of expecting knowledge workers to produce with demonstrably obsolete tools. In previous posts I have been dismissive of what I consider an overly-abbreviated product life cycle for Microsoft's 32-bit systems, but the present case is different, for two reasons:
1) In comparison to the latest Microsoft OS, or even NT, the W9x flavour simply does not work as well; and
2) The product life cycle for W95 in particular is certainly long enough.
That I am not alone in supporting this line of argument is suggested by this article:
http://eletters.wnn.ziffdavis.com/zd1/cts?d=75-105-1-1-618817-4353-1
and more extensive coverage of the implications of retiring W9x is provided here:
http://www.eweek.com/category2/0,4148,1411728,00.asp
And now, in a more recent reprieve, Microsoft has had a change of heart, so "support will now last until June 30, 2006 for Windows 98, Windows 98 SE and Windows Me [sic]":
http://entmag.com/news/article.asp?EditorialsID=6084
I do wish they would make up their minds about this! Even so, the main point remains valid -- Windows 98 is a fading technology, and companies still using it should plan for a replacement now.
http://www.globetechnology.com/servlet/story/RTGAM.20031211.wtxkapica1211/BNStory/Technology/
One of the refrains of this blog is that the current legislative regime, particularly as it relates to intellectual property, is in fact repressive, benefits only the "haves", and contains long-term destructive effects for technology competitiveness. When it becomes more profitable to sue than to innovate, more will sue, which is like men in a leaky boat fighting over a cup of water. This article essentially agrees with me, which simply highlights the good sense and discriminating intellect of its author.
Whenever a corporate entity launches a legal challenge in the public interest, we should look very carefully to see whose ox is being gored.
http://www.techcentralstation.com/120903A.html
If the world hands you a lemon, make lemonade -- this is the message of this suggested solution to the technical unemployment issue. Two trends: accelerated learning, taking place outside the classroom, and the requirement for personal services, are the keys to the type of work which resists offshoring. Flexibility is a prerequisite to realizing the potential for the new job market.
Another perspective on outshoring is provided by this article, which suggests that there are a number of tacks to take to secure IT jobs in the USA:
http://www.cioupdate.com/career/article.php/3116471
Related to outshoring is the capability of IT workers to work remotely, which essentially makes location irrelevant. It turns out that a number of IT jobs still require "face time", as indicated in this article:
http://www.newsfactor.com/perl/story/22822.html
Yet another article which suggests that while some IT specialties are being outshored or eliminated altogether, there is plenty of scope for growth if the unemployed are sufficiently flexible to take advantage of it:
http://www.ecommercetimes.com/perl/story/32280.html
Systems engineers, business process experts, and security professionals are cited as three examples of employment areas which will continue to be strong, and which resist outshoring by their nature.
http://www.usenix.org/events/sec02/full_papers/staniford/staniford.pdf
An analysis of the risks and prospects for worms on the InterNet, using Code Red as a model. This paper: "How to own the Internet in your Spare Time" suggests some preventative measures which can and should be deployed.
http://www.thenetworkadministrator.com/HowToBuildACheapNOC.htm
Rather brief article, with links, on how to build a Network Operations Center for some $2,000 [the major cost being monitors and a medium-grade workstation, plus the cost of any Windows NOS being used]. There is a lot more to doing this than meets the eye, but it is worth investigating, because such a NOC lab could be an excellent teaching tool.
As I have mentioned elsewhere, network management tends to lack applied focus in many educational venues. A proposal like this would go far to remedy such a lack, and would give all involved absorbing hands-on experience. This could be further extended with, for example, Cisco network management solutions, if costs could be negotiated for educational purposes.
A blog which serves as an OS tweaking site with help on a wide variety of operating systems, plus some resources for imaging and Visual Basic. Searchable with news articles and downloads. Well laid-out, with an interesting block showing the IP address you are using to contact the site, the browser being used, and what language is set in the browser.
http://www.economist.com/science/tq/
The Economist carries a good deal of weight in the dead-tree world; here is a quarterly on-line look at technology, with back issues, available for free. The major problem with a resource like this is not getting sidetracked by "oh that looks interesting" -- the main reason why it takes me 3.5 hours to look up "Cat" in the encyclopedia.
http://whitepapers.comdex.com/data/detail?id=1070907380_696&type=RES&src=KA_RES
In comparison to hardware and configuration details, network management often gets scanted in networking courses, even if it is supposed to be a curriculum item. Network management is just so theoretical and amorphous without an actual network to manage. This white paper: "Cisco - Network Management System: Best Practices White Paper" covers the 5 ISO management functional areas with reference to managing Cisco systems, and is therefore a potentially useful resource in making network management instruction manageable.
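For the curriculum-builder, the five ISO management functional areas the paper covers [usually memorized as "FCAPS"] can be posted as a one-screen crib, here in Python:

```python
# The five ISO network-management functional areas ("FCAPS"),
# each paired with the concrete question it answers.
FCAPS = {
    "Fault":         "What has broken, and how do we detect and fix it?",
    "Configuration": "What is deployed, and how is each device set up?",
    "Accounting":    "Who is using which resources, and how much?",
    "Performance":   "Is the network meeting throughput/latency targets?",
    "Security":      "Who may access what, and are controls enforced?",
}

for area, question in FCAPS.items():
    print(f"{area:<13} -- {question}")
```

Hanging each lab exercise off one of these five questions is one way to make the amorphous subject manageable.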
This paper on Wireless LANs may be useful as an adjunct to this: "Wireless LANs: The Essentials for Saving Your Sanity":
http://whitepapers.comdex.com/data/detail?id=1046952003_302&type=RES&src=KA_RES
The Cisco take on wireless LAN planning: "Preparing for Wireless LANs" is provided here:
http://whitepapers.comdex.com/data/detail?id=1057858029_986&type=RES&src=KA_RES
while this paper looks more closely at a major management concern: "Cisco - Change Management: Best Practices White Paper":
http://whitepapers.comdex.com/data/detail?id=1070907379_141&type=RES&src=KA_RE
and another management area is covered in this document: "Cisco - Configuration Management: Best Practices White Paper ":
http://whitepapers.comdex.com/data/detail?id=1071077473_593&type=RES&src=KA_RES
http://whitepapers.comdex.com/data/detail?id=1070473161_825&type=RES&src=KA_RES
If wireless security is not a concern, it should be; the basic WEP standard has demonstrated weaknesses, and undetected interception is so much easier with wireless that additional measures must be undertaken. This white paper: "Practical Solutions for Securing Your Wireless Network" can give you some pointers on how to reap wireless roses without security exploit thorns.
Another security paper from Cisco Systems focusses on: "Technology Best Practices for Endpoint Security":
http://whitepapers.comdex.com/data/detail?id=1070907383_68&type=RES&src=KA_RES
which introduces another layer into the security cake.
http://www.intranetjournal.com/spyware/
With the increasing prevalence of malware [computer programs foisted on you to your detriment] a good clear guide on what it is and how to deal with it certainly will not go amiss, and here is one such. In addition to being used for maintenance purposes, this is a good way to make students aware of many potential problems in computing practices they may take for granted.
Once you have worked in networking or security for a while, you take all this for granted, but for those without a technical background, this is a useful wake-up call.
Here comes that big buzzing bee again! In this case, a project under the aegis of James Burke [he of "Connections" fame], aiming to create a collaborative knowledge space on the Web which will foster new ways of thinking and improved means of knowledge management and use. I have been relentless in my crotchet that we have a plethora of tools, but we just aren't using them in any way that takes full account of their potential [not least because the complexity of the tools makes such potential difficult to grasp in the first place -- which is all the more argument for disciplined experimentation].
Here is the outline of a plan and a methodology which promises to redress this shortcoming; at least it provides a good place to start. If we never start, we may get there anyway, because the tsunami is sweeping towards us, but I suspect strongly we will not like the results if we take that approach.
http://entmag.com/reports/article.asp?EditorialsID=55
Despite repeated predictions of its demise, NT 4.0 keeps on ticking in many organizations, and this in-depth article explains why. The support deadline dates have been extended again, and many of the BackOffice bits and pieces working with NT 4.0 have also had their support extended.
As geeks are fond of remarking: "If it ain't broke, fix it until it is!". This gives increased strength to the argument that many organizations will make the leap directly from NT to Server 2003, which is an interesting datum for planners and educators alike.
http://www.businessweek.com/smallbiz/content/dec2003/sb2003122_8887.htm
Article describing an innovative approach to offshoring: hire programmers in the USA and pay them Indian/Russian-level wages. The company which tried this was deluged with applications from out-of-work programmers who were more than willing to take half a loaf, and all of the complexities of remote dealing went out the window.
Whether this will work more generally is open to debate, and it would seem to have a depressing effect on the standard of living in the USA [not as depressing as being unemployed, of course]. But one of the tenets of the global economy is global competition, and it may well be that we have here an ebbing tide which lowers all boats.
Article which covers threat analysis and classification for Cisco systems, providing a good primer on how things can go wrong with your network, especially if someone is actually trying to do you harm. Knowing where threats originate, and how they can be divided into different types, are both useful subjects for teaching network administration.
http://www.informit.com/guide/content.asp?guide=windowsserver&seq_id=36&120703
Group policies are a most valuable and powerful tool in a Windows Active Directory environment; they are also somewhat abstract and difficult to explain. Here is an article which goes into group policies and the tools used to administer them, presenting a wealth of useful background information to allow ready comprehension of this topic.
Another gem unearthed by one of my class teams. This one is long on substance rather than graphical glitz, and contains hundreds of entries relating to companies, media, organizations, programs/projects, UseNet resources, standards, and extensive cross indices and compendia to computer and communication documentation.
The site is also searchable.
http://www.sysadmincorner.com/
I have mentioned before in this blog the joys of learning from students as well as teaching them -- here is another example. The site has thousands of classified links on operating systems, servers, networking, and resources: programming, applications, books, general information, and training & certification. The site is searchable, and contains documents, free code samples, and reviews of books and software.
You can spend a lot of time in this corner....
RSS resources exist in abundance -- here is a selected list, thanks to Scot Finnie:
Introduction to RSS - Webreference
http://www.webreference.com/authoring/languages/xml/rss/intro/
What Is RSS? - XML.com
http://www.xml.com/pub/a/2002/12/18/dive-into-xml.html
RSS Tutorial for Content Publishers and Webmasters - Mark Nottingham
http://www.mnot.net/rss/tutorial/
Lockergnome's RSS Resource
http://rss.lockergnome.com/
All About RSS - Fagan Finder
http://www.faganfinder.com/search/rss.shtml
RSS Feed Reader / News Aggregators Directory - Hebig.org
http://www.hebig.org/blogs/archives/main/000877.php
Top Aggregators - UserLand
http://backend.userland.com/directory/167/aggregators
RSS Readers - Weblogs Compendium
http://www.lights.com/weblogs/rss.html
The importance of RSS as a means of communicating between teachers and students [even long after the formal relationship between them no longer exists] is a potential benefit which has not been discussed in the detail I would prefer to see.
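For the curious, what an RSS reader actually does under the hood is no mystery: it fetches an XML document and pulls out the item titles and links. Here is a minimal sketch in Python using only the standard library; the feed content below is a hypothetical teacher-to-student example, not taken from any of the sites above.

```python
# Minimal sketch of the core of an RSS aggregator: parse an RSS 2.0
# document and extract each item's title and link.
import xml.etree.ElementTree as ET

# A hypothetical feed, inlined so the example is self-contained;
# a real reader would download this XML from a feed URL.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Course Announcements</title>
    <item>
      <title>Lab 3 posted</title>
      <link>http://example.edu/lab3</link>
    </item>
    <item>
      <title>Exam date changed</title>
      <link>http://example.edu/exam</link>
    </item>
  </channel>
</rss>"""

def read_feed(xml_text):
    """Return a list of (title, link) pairs from an RSS 2.0 document."""
    root = ET.fromstring(xml_text)
    items = []
    for item in root.findall("./channel/item"):
        items.append((item.findtext("title"), item.findtext("link")))
    return items

for title, link in read_feed(SAMPLE_FEED):
    print(title, "->", link)
```

The entire trick of syndication is in those dozen lines: once announcements are published as a feed, any student's aggregator can poll it without the teacher lifting a finger.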
http://download.microsoft.com/download/a/2/f/a2fc47d2-8bdf-4977-8364-1f38b893dba5/lharch_pdc2003.png
How many words this picture is worth in terms of discussing Microsoft's next-generation operating system may be debatable, but if you compare this with the standard UI/Kernel diagrams published for W2K, the complexity of the new OS is enough to demand padding on your jaw, to avoid bruises as it drops. There is a lot more to this new OS [presuming that what is bruited about now in fact comes to pass, about which there is more than a little doubt] compared to previous 32-bit Windows editions.
An important component of all this is the new "Aero" user interface, about which a cornucopia of articles can be viewed here:
http://msdn.microsoft.com/longhorn/understanding/ux/
The set of technologies for building and running the Longhorn communications infrastructure is described here:
http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnlong/html/indigofaq1.asp
While incandescent purple is the only appropriate colour in which I could render my opinions about the Next Generation Secure Computing Base, a point worth hoisting aboard, contrary to rumours, is that you won't have to have or use it:
which is fine by me -- I will give up a little security to gain a little liberty.
One last note: the hardware demands for Longhorn are somewhat steep, though not out of line with the direction computer developments are taking. Were I to be buying systems with an eye to installing Longhorn in the future, I think the extra dollars involved in specifying compatible machines now would be very well invested indeed.
Microsoft knows it faces some uphill battles -- but it does seem to be making its next-generation OS fatally attractive.
http://www.masternewmedia.org/2003/11/21/bloggers_as_independent_news_reporters.htm
Article which makes the interesting point that a major function of bloggers in the information ecosphere will be as independent reporters [effectively the exact opposite of what John Dvorak believes], giving a host of reasons why. For me, the tipping point comes when I read something misreported in the general media because the reporter simply does not know enough to do a competent job [again, something which has been mentioned in this blog before].
So it makes perfect sense: if I read something by someone claiming some expertise, and that person delivers in his written material, then I am going to see that person as credible. With the decay of political and institutional credibility, I think that matters a lot more than many commentators have hoisted inboard to this point. In a very minor way, this blog is a point of proof -- I claim some expertise in designating resources and concepts of interest to applied IT teachers -- and my entire credibility rests on what use others can make of what I write.
To the tune of "Ghostbusters!" -- Who you gonna believe? Good bloggers!
http://www.masternewmedia.org/2003/11/28/personal_knowledge_mapping_and_the.htm
The deafening buzzing of this particular bee in my hive-like bonnet is simply the result of intense frustration at the organizational barriers to effective knowledge use, when breaking down those barriers is a win/win situation. Here is a richly annotated article on "Personal Knowledge Mapping And The Concept Of Data Emergence" which simply bristles with insights relative to this issue, particularly in explaining expensive implementation failures.
There was an old motto of the Whole Earth Catalog: "We are as gods and might as well get good at it", and that applies here in spades. We have the tools, but cannot see how to apply them [nor, in truth, is such application straightforward, cheap, or easy]. But it requires some iron in the nerves, and some appreciation of "Victory, dammit, Victory!".
Keeping up with developments in IT has been aptly compared to trying to drink from a firehose -- the water is there, but there is just so much of it! Just keeping tabs on Linux developments is enough to leave your day hourless, so here is a Web site, linuxpipeline, which helps to channel the flow.
Covering news, trends, demonstrations, products, applications, and offering a free newsletter and a glossary, this site probably deserves bookmarking.
Incidentally, I was alerted to this site by Scot Finnie's Newsletter:
http://www.scotsnewsletter.com/subcenter/subscribe.asp?type=subscribe
which is a sufficiently valuable resource that I can plug it here without shame. It is free, but donations are solicited, and it is some measure of the esteem in which I hold it that someone as notoriously reluctant to part with a buck as I am has nevertheless contributed cash to it in the past, and will do so in the future.
http://adtmag.com/article.asp?id=8606
While this article mentions the importance of operating system emulation software for developers, the fact is that it is equally important to educators and students [for the latter, it is a particularly elegant solution to home network training]. This article reviews and contrasts the two top contenders: VMWare, which is capable of a wide variety of emulations while being quite costly; and Virtual PC from Microsoft, which currently only emulates Microsoft's OS, but is considerably cheaper.
This software category is one with which applied IT instructors should be familiar.
http://www.cioinsight.com/article2/0,3959,1395384,00.asp
Interesting article which connects the dots among RFID, ubiquitous wireless networking, and the continual downsizing of computational devices. The result could be a seamless information ecosphere to which we adjust as we move; and the timeline suggested for this is under 5 years out.
This is another example of something which just yesterday was science fiction, and tomorrow, appears to be a coming reality.
http://itmanagement.earthweb.com/career/article.php/3114871
Article reflecting a consensus in the IT employment arena that the worst of the slump is over, and that better times are coming. A Dice.com survey suggested that demand for hands-on technology professionals, particularly in C++ and Java, is ramping up.
The fly embedded in this ointment: the pool of skilled professionals is so large that it will be some time before newly-minted IT graduates will experience any sort of employment demand.
http://mcpmag.com/Features/article.asp?EditorialsID=384
In a lot of IT teaching programmes, a main student demand is for 'hands-on experience', and there is every pedagogic reason to comply. The problem is cost -- racks of servers with installed operating systems and software carry a hefty price tag. While I will argue that those institutions willing to pay this price tag give themselves a considerable competitive advantage, not all institutions involved in IT teaching will be able/willing to pay it.
Here is a detailed article with sources and suggestions on creating a low-cost laboratory for MCSA/MCSE work for about $1K per laboratory station. In addition, it offers a useful checklist of potential activities which could be employed in any program teaching network resource access management, whether directed towards certification or not.
http://whitepapers.comdex.com/data/detail?id=1069950012_176&type=RES&src=KA_RES
I must be in an exceptional mood today -- usually, if a resource requires an extensive registration [and this one certainly does], I don't feature it in these selections. It is hard enough just finding and understanding the information we need to teach IT effectively without battling additional obstacles. In this case, however, the reward is worth jumping through the hoops: a meaty, 79-page guide to the whole mare's nest of issues surrounding Windows Group Policies.
Now Group Policies are one of those annoying things which are immensely powerful, extremely important for IT students to know about, and yet remarkably resistant to effective teaching, because they are so abstract in the absence of a specific environment in which to apply them. Having a guide which can make some practical sense out of this is comforting in itself, and will, no doubt, serve as a source of precept and example to assist in teaching this subject.
http://whitepapers.comdex.com/data/detail?id=1070381782_523&type=RES&src=KA_RES
There is certainly enough going on in the security world these days that having a set of useful tips on hand for vulnerability reduction can come in quite handy for practitioners and educators [the latter sometimes wearing both hats] alike. This white paper: "Best Practices for Vulnerability Management" provides some guidance on how to go about reducing your risks.
More assistance can come from the following white paper:
http://whitepapers.comdex.com/data/detail?id=1069950009_199&type=RES&src=KA_RES
which covers this topic "From Project to Process - Policy-Based Vulnerability Management".
A look at crucial issues relating to the IT core comes from a white paper titled "Core Security", found at:
http://whitepapers.comdex.com/data/detail?id=1069861581_139&type=RES&src=KA_RES
http://www.eweek.com/article2/0,4149,1399254,00.asp
While short-term issues tend to get short shrift in this blog, I am consistent in making exceptions, so here is a prediction on Linux for 2004. The reason why it is included is the emphasis on the fact that Microsoft is pricing itself out of the market, which is certainly true, and something to consider. But I also wonder if Microsoft is not also shutting itself out of the market with its DRM and other forms of overzealous intellectual property schemes.
As the article states, grandma really may not care about all of this, but I would think the sorts of issues repeatedly raised in this blog by someone who is in no way an anti-Microsoft zealot probably give pause to others higher up the corporate food chain. As for the home market, a turnkey Linux installation which supported e-mail, Web browsing, multimedia, and a simple office suite, while hiding the gory details, certainly would bid fair to supplant XP Home.
The revelation in the article that the single most expensive component in a low-end PC is the operating system certainly represents a change in the way things are done [and if nothing else, something of a vindication of the theories George Gilder championed over a decade ago], and perhaps one whose implications Microsoft [and indeed everyone else] has not truly read, marked, learned, and inwardly digested.
http://news.com.com/2100-7337-5112061.html
The demise of Moore's Law is one of the favourite topics for pundits to ponder -- I must see some reference to it coming to an end every other month or so. The point, as the report referenced in this article makes clear, is that we still have a couple of decades to go before we reach those limits. Anyone hardy enough to crystal-ball the situation 20 years out has more brass than I.
Links to a number of related stories are appended to this one.
http://news.com.com/2100-1028_3-5112430.html
The egregious attempt by Diebold to use the DMCA to throttle criticism of its defective electronic voting system has resulted in the company's ignominious capitulation in court. In fact, the appellants are still seeking a court order proscribing like acts in the future.
It is pleasant to see the good guys win one for a change.
http://www.japantimes.co.jp/cgi-bin/getarticle.pl5?fl20031125zg.htm
Japan has always taken a leading role in the development of robotics, with the latest evidence being a virtual avatar appointed as a digital diplomat to the ASEAN nations. Yet the overall attitude towards robotics development in Japan is to see it as a sort of dream which can inspire recovery from its current depression. The wider implications of this certainly are not easy to evaluate.
http://www.msnbc.com/news/994223.asp?0cv=KA01
The concept that IT does not matter, as indicated in a recent Harvard Business Review article discussed in previous postings in this blog, is something which is vigorously disputed in the IT community. In addition to the cited article above, which attempts to evaluate the degree to which the bloom is off the high-tech rose generally, this article:
http://www2.cio.com/analyst/report1929.html
makes the argument that even on the individual firm level, the fact that IT has become a general commodity has nothing to do with the effectiveness with which it is used, and it is the latter which generates real competitive advantage.
http://www.upi.com/view.cfm?StoryID=20031114-052912-7678r
Article which makes a point so salient I wonder that others have missed it -- the savings to be gained by offshoring technical jobs are nothing like those available to companies who would like to avoid the obscene overpayments which top executives receive. There is, by exact analogy to technical work, nothing magical about North American managers, and no reason why their skills should not compete on an open global market as well.
Of course the ultimate outcome of this happy prospect is the complete relocation of high-technology industry offshore, which will have negative implications for the USA standard of living [Hollywood can do just so much to pick up the slack, after all]. The only question being: given the immense monetary power such executives can bring to bear on the government, will such rationalization actually be allowed to happen?
http://www.informationweek.com/story/showArticle.jhtml?articleID=16000606
Extensive article looking into the motivation of the hacker community, pointing out that it has its educational virtues as well as its criminal tinges. Knowing the motivation and activity of hackers should interest educators, especially as many hackers either get their start or remain comfortably ensconced in university computer systems.
http://www.thewhir.com/features/euro-skills.cfm
Article pointing out that there was a predicted IT labour shortage in Europe too, before the tech bubble popped so loudly. In addition to reduced domestic demand, the supply of available homegrown talent has been increased in Europe consequent on tightened USA restrictions on technical immigrants. As in the USA, offshoring is also a problem, and the prospects for a rapid revival of the European employment market are regarded as dim.
http://www.circleid.com/article/369_0_1_0_C/
Article bemoaning the inability of even the erudite popular press to get IT issues right, using as a focus an article about IPv6 and its necessity as misreported by the BBC. Indeed, it often appears that the level of press information, far from aiding informed choices, confuses more than it counsels. The most disturbing point the author raises, and one I encounter whenever I detect an error in an information source, is this: if we are reading a book which addresses some things which we know and some we do not, and the part we do know has multiple errors, how much can we rely upon that part of the exposition about which we know little or nothing?
My answer to that is fairly blunt, if not actually pungent.
http://www.linuxworld.com/story/38073.htm
Debate between two commentators on problems with the Open Source community, revolving around issues of:
1) Developer redundancy
2) Propensity for Open Source feuds
3) Misdirected developer attention
4) "Us vs. Them" mentality
5) Microsoft as the Beast of Babylon
In fact, the commentators wind up essentially agreeing on 4) and 5) above; the spirited community input into the discussion section enlarges aspects of this debate further.
http://www.economist.co.uk/science/displayStory.cfm?story_id=2246018
An Economist article covering much of the same ground as a number of past posts to this blog relating to networking security problems. It provides a good review of the major issues, and suggests methods of countering such disruption. Interestingly enough, in view of the position I have taken on this matter in previous posts, one suggestion is that outright anonymity cannot be supported on the Internet of the future.
http://www.microsoft.com/technet/itsolutions/MSIT/Security/mssecbp.asp
Given that Microsoft's own network is a number-one target for attacks, some explanation of the principles the corporation uses to safeguard itself is certainly worth inspection, and that is what this white paper: "Security at Microsoft", provides.
Most of the suggestions relate to using Windows 2003, but could be retrofitted to W2K systems.
http://www.macalester.edu/~fines/batch_tricks.htm
Back in the bad old days of DOS ["When men were men, and computers knew their place"], just about the only way to do any repetitive and complex computer task was through batch file programming. While I even have a book on the subject, in fact my batch file prowess was middling -- but you did have to do it. Even though batch file programming may now seem as useful as shoeing a horse, there are still places where you might need it, and it does have some intrinsic interest. This site has a whole whack of batch file trickery to admire and adapt.
For those wishing to reach for a more powerful tool more appropriate to GUI interfaces, the following might be worth a look:
KiXtart (http://www.kixtart.org) is a scripting tool specifically designed for writing logon scripts.
Since I do not use it myself, I cannot tell if it promotes rust or not, but those whose bent aligns this way are competent to assess it for themselves.
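To give students a feel for what all that batch-file trickery was actually for, here is the sort of repetitive chore once handed to a DOS batch file -- copying every .log file in a directory into a backup folder -- sketched in Python instead. The directory layout and file names are hypothetical, invented for the demonstration.

```python
# Sketch of a classic batch-file chore in Python: back up every
# *.log file from one directory into another.
import shutil
import tempfile
from pathlib import Path

def backup_logs(src_dir, dest_dir):
    """Copy every *.log file from src_dir into dest_dir; return names copied."""
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    copied = []
    for log in sorted(Path(src_dir).glob("*.log")):
        shutil.copy2(log, dest / log.name)
        copied.append(log.name)
    return copied

# Demonstrate against a throwaway directory rather than a real system path.
work = Path(tempfile.mkdtemp())
(work / "app.log").write_text("started\n")
(work / "web.log").write_text("GET /\n")
(work / "notes.txt").write_text("not a log\n")
print(backup_logs(work, work / "backup"))
```

The old `FOR %%F IN (*.LOG) DO COPY %%F BACKUP\` one-liner did much the same thing, with rather less readability and no error handling to speak of -- which is exactly why the scripting tools have moved on.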
http://searchwin2000.techtarget.com/tipsIndex/0,289482,sid1_tax294820,00.html
A series of browseable accounts of true bloopers in the IT arena. As well as being useful to consider when developing a disaster prevention policy, these could serve as extremely useful teaching or problem-solving starters.
http://www.technewsworld.com/perl/story/32166.html
One of the myths of the PC age is that the PC has replaced the mainframe [the term "dinosaur" is thrown around in relation to the latter as if it weighed nothing]. IBM makes out like the bee which does not know it cannot fly, and being able to make mainframes sit up and do the network dance is still a good job specialty. Now the claim for grid computing suggests that the mainframe is no longer needed.
And it may even be true, though I would be inclined to wait a bit before pulling my big iron's plug.
http://www.blogstreet.com/rssecosystem.html
The place to go to find RSS tools and RSS feeds, with over 24,000 on tap as this entry is written.