Trench Tales (Part 4) – More Apple in the Enterprise

Introduction

This series of articles leverages the expertise of IT pros from around the world who have shared their stories with me through my role as editor of WServerNews, the world's largest newsletter focused on system administration and security issues for Microsoft Windows Servers, and through several other channels such as my connections with IT pros in my role as a Microsoft Most Valuable Professional (MVP). These stories have been edited for length and clarity where needed.

Tip: If you haven't subscribed to WServerNews yet, you should do so today!

Macs and Public Key Infrastructure

A reader told me about the following issue he discovered when trying to manage machine certificates on Mac computers from within his Windows Server-based PKI infrastructure:

A major issue I found relates to X.509 certificates, which is a pain because Macs don't handle security certificates the way PCs do. Machine certificates on a Mac mean nothing: they can be copied from one machine to another, including the private key. You can import any machine certificate with a private key on a Mac system. Mac systems also allow you to change the 'uses' of a certificate, no matter what the intended purposes were. It is a little frustrating not being able to use a PKI solution because a vendor decided to ignore standards and conventions.
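His point about portability is easy to demonstrate: a machine certificate exported together with its private key is just a PKCS#12 (.pfx) file, and nothing in the file itself ties it to the machine it came from. Here's a minimal sketch in Python using the cryptography package (the file names and password are placeholders of my own) that loads such a bundle and re-serializes it for import somewhere else:

```python
# Minimal sketch: a machine cert plus its private key, exported as PKCS#12,
# is just a portable file. File names and the password are placeholders.
from cryptography.hazmat.primitives.serialization import (
    BestAvailableEncryption,
    pkcs12,
)

with open("machine-cert.pfx", "rb") as f:  # exported from a Windows machine
    key, cert, extra_certs = pkcs12.load_key_and_certificates(
        f.read(), b"export-password"
    )

print("Subject:", cert.subject.rfc4514_string())

# Re-serialize the same key and cert; the result can be imported on any
# machine, Windows or Mac, unless the key was non-exportable or bound to
# hardware (such as a TPM) in the first place.
bundle = pkcs12.serialize_key_and_certificates(
    b"copied-machine-cert", key, cert, extra_certs,
    BestAvailableEncryption(b"export-password"),
)
with open("copied-machine-cert.pfx", "wb") as f:
    f.write(bundle)
```

In other words, if the private key can be read from disk, the identity travels with it; only non-exportable or hardware-bound keys prevent that.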

When I asked him if this issue had caused problems in his environment, he replied as follows:

Yes it has. You can't use machine certificates as credentials if you can't prove they're valid. I have copied machine certs with their private keys from several Windows machines to a Mac, imported them, and was easily able to impersonate any of those machines for validation. As a security professional I find it scary, as it changes the MITM attack to "Mac in the Mix." We have to find other ways to do machine validation than relying on X.509 certs. Apple claims that everyone else is doing it wrong and that machine certs aren't really meant to be tied to a machine but to a session, yet I was able to load five different machine certs and all of them worked fine and could be verified.
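One thing a relying party can do is validate what the certificate actually asserts, namely its chain and its Extended Key Usage extension, rather than trusting whatever purpose the presenting system has locally assigned to it, because those local 'uses' are not part of the signed certificate. Here's a rough sketch of the EKU check, again in Python with the cryptography package (the file name is a placeholder):

```python
# Rough sketch: read the *signed* Extended Key Usage extension from a cert.
# Local keychain "trust" overrides never modify this signed data, so a
# relying party should check it here rather than trust the client's claim.
from cryptography import x509
from cryptography.x509.oid import ExtendedKeyUsageOID, ExtensionOID

with open("presented-cert.pem", "rb") as f:  # placeholder file name
    cert = x509.load_pem_x509_certificate(f.read())

try:
    eku = cert.extensions.get_extension_for_oid(
        ExtensionOID.EXTENDED_KEY_USAGE
    ).value
    allows_client_auth = ExtendedKeyUsageOID.CLIENT_AUTH in eku
except x509.ExtensionNotFound:
    allows_client_auth = False

print("Client authentication permitted by issuer:", allows_client_auth)
```

Of course, this doesn't stop an attacker who has already copied the private key; as the reader says, that calls for other approaches to machine validation, such as keys that never leave the hardware.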

After searching a bit I found this thread on the Windows Server Security forum on Microsoft TechNet that may shed some more light on this issue.

I also found this article titled “How to request a certificate from a Microsoft Certificate Authority using the ADCertificatePayloadPlugin” in the Apple Support Knowledge Base that may also be helpful.

Anyway, I'd be interested in hearing from any other readers who have had problems trying to integrate Macs into a Windows-based PKI environment. If that's you, send me an email at [email protected]. Thanks!

Estimate the True Cost

The next reader runs an IT support shop for small businesses, and he had the following thoughts to share on this subject:

I work with smaller companies (a few of them have Windows Servers), and integrating Apple devices is not that bad. Compared to Android devices, which a lot of users are also moving to, it's no worse and in many cases it's better…

I would say that there are a few product lines in play here:

1. Smartphones and tablets

2. Desktops and laptops

The smartphones and tablets work well. There may not be management consoles for total control, but connecting iOS devices to Exchange works fine.

On the OS X side, I have to say there is a bug that a lot of people (including me) have experienced where folders disappear (though they are still safe on the server); it has taken months to get a fix, and it's still not fixed. It seems it may be related to having more than one account on an Exchange server. Hooking OS X devices up to the network still has some minor issues (I haven't done a lot of these and don't use the special tools you mention), but it does work.

I do think, though, that Apple is not committed to the enterprise. Their laptops don't have docking stations, and the iMacs with built-in monitors mean that if your computer OR your monitor dies, you have to ship the WHOLE computer off for repair. Working on the insides is a pain: it took me about an hour and a half to replace a hard drive in a two-year-old iMac, since the drive sits behind the screen and the RF tape. On a PC it would have taken about 5 minutes to swap out a hard drive. However, restoring a backup was easier than on a PC.

So there are pros and cons to the Macs. Fanboys say the Mac is the best, but THEY don't work in the enterprise and have to do the connectivity part. They usually just use Gmail and web-based programs and assume those would work great in an enterprise, which is not the case; Apple doesn't have enterprise accounting apps, etc. The Apple products are quality products, though. Although the cartoon you posted from The Oatmeal is funny, I don't think it's the case these days. A $499 iPad is as cheap as a comparable Android tablet, and it has a whole infrastructure behind it that is amazing for certain uses, though maybe not always for the enterprise.

Consider a small company: if they buy a PC for $300 less than a Mac (and that gap is closing) and they suffer a malware attack, it can take from 1 to 6 hours, or $120-$600 (see below), to get them back up and functional. Add that cost to the PC and you may well be paying more than the cost of a Mac. Factor in the antivirus program you have to buy for a PC, the time to install and maintain it, and the CPU cycles it steals, and the Mac is suddenly not that expensive.

Why $120-$600 to fix a malware infection? If a computer is completely crippled by malware, needs to be reformatted, and there are no backups, it can take up to 6 hours to get everything back into place: install Windows, the 60 Microsoft updates released since that copy of Windows shipped, Office, printers, programs, and data, then set up email and configure LAN settings, file shares, and so on. So yes, with NO full-image backups (Ghost or TrueImage), it can cost a small company owner (I know this isn't enterprise, but an enterprise can suffer a similar setback even with an in-house IT team) a lot more to keep that PC running than it would a Mac.
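To make the reader's arithmetic concrete, here's a back-of-the-envelope sketch using his own figures; the antivirus license cost is a placeholder I've assumed:

```python
# Back-of-the-envelope version of the reader's argument. All figures are the
# reader's estimates except ANTIVIRUS_LICENSE, which is an assumed placeholder.
PC_PRICE_ADVANTAGE = 300    # a comparable PC costs roughly $300 less up front
CLEANUP_COSTS = (120, 600)  # one malware rebuild: 1 to 6 hours of labor
ANTIVIRUS_LICENSE = 50      # assumed cost of an AV license for the PC

for cleanup in CLEANUP_COSTS:
    net = cleanup + ANTIVIRUS_LICENSE - PC_PRICE_ADVANTAGE
    verdict = "more" if net > 0 else "less"
    print(f"After one ${cleanup} incident, the PC costs ${abs(net)} "
          f"{verdict} than the Mac overall")
```

At the low end the PC still comes out ahead, but one long rebuild wipes out the purchase-price gap entirely, which is exactly the reader's point.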

This reader has a good point: when you evaluate the cost of a platform or product, you need to factor in not just the upfront costs (licensing, hours for deployment) but also the ongoing maintenance cost over the entire lifecycle of the product. That's hard to do, but you've got to take a stab at it, or you may end up spending far more money than you anticipated, to the detriment of your business's bottom line. My only arguments with the reader are these: first, if you spend some money up front on effective malware protection (or use a free product like Microsoft Security Essentials), you can significantly reduce the anticipated cost of removing malware from infected systems. And second, if you learn how to use the Microsoft Deployment Toolkit (MDT) to automate building your Windows reference images, it won't take much time or effort at all to wipe and reload an infected computer. All that's needed is the time it takes to learn MDT and set it up to build updated images with a few clicks of your mouse. You can start learning how to use MDT for Windows deployment here: http://technet.microsoft.com/en-us/windows/dd641427.

A Stroll Down Memory Lane

Finally, here’s an email exchange I had with another reader which wandered back into times we both wished were still around…

I supported Apple computers in a Windows-centric environment from 1991 to 2009, and the information in the article is pretty accurate as far as I can remember. When I worked for Latran Technologies (the former Polaroid Graphics Imaging division, which became PGI LLC), they had Macs connected to their proprietary Solaris-based imaging systems as well as to the PC servers. When OS X replaced the older Mac OS, I turned off AppleTalk and File Sharing for Macintosh on my old NT servers, which we were still using, and connected the Macs to the Windows network using Samba instead. I can say that this really did speed things up, because AppleTalk is awfully slow! The imagers were connected to PCs running Harlequin RIP 4.5 and up. Initially these RIPs 'published' themselves on the network and the Apple computers connected to them as AppleTalk devices, while the PCs saw them as Windows print spoolers and connected to them as well. Later on, when OS X came along, the RIPs became regular IP printers, which really saved the network.

In the earlier years my family ran a graphics company. I supported their networked environment, and we had to connect a RIP to the network that both PCs and Apple computers could use. This RIP used AppleTalk even though it was a PC running Windows 3.1. And this was not 3.11 or Windows for Workgroups; it was plain Windows 3.1! Underneath Windows was a special driver provided by COPS (Cooperative Printing Solutions). The COPS drivers were very heavy on the system and relied on Windows networking, which was yet another thing loaded on the machine. Remember, back in 1992 there wasn't much RAM in PCs: the old RIP machine had only 640K, and by the time everything loaded there was only 300K left for the program. The COPS drivers were bound to a 3Com 3C509; no other network card was supported, and the drivers had to be manually pointed to the card by editing the binder and driver INI file, where the card's interrupt had to be added along with the protocol used to 'see' the card. All of this was read into the system by the Harlequin RIP software, which emulated a Linotronic 300 imagesetter. This proprietary 'box' drove a PEL box ECRM VR30 film recorder, and the PostScript files were 'ripped' to the imagesetter and printed out on film. We also had to run the COPS network drivers on the Windows PCs so they could connect to the RIP, which added one more thing to those PCs and made them slower still. As time went on we were able to share the RIP as an AppleTalk printer on our NT server, which sped things up quite a bit because we no longer needed to run COPS on the PCs.

It’s funny when I think about it now, but I always wondered how everything worked so well because so little RAM was provided for the RIP software. The old RIP carried on quite well until 2008 when my brother could no longer get film for the imagesetter. In 2010 he gave the system to a friend who owned a print shop. The power supply died in the system, and the hard drive was dying. I made a copy of the hard drive, which was a fast Quantum 500 MB drive, and saved the complete install before everything went down. You got to love old Windows. A complete install was 50 MB, and everything fit easily on a CD-ROM. We then copied everything to a new hard drive and plugged it into the system. No install was needed! Initially we tried loading the system on to a faster P4, but the Bus was too fast and Windows crashed. In the end we ended up with “franken-machine” by finding an ancient power supply and a newer hard drive, for the old system which surprisingly the old ‘486 supported. The old system, by the way, is an EISA bus system! Trying to find the old EISA bus interrupt disk was quite the chore! Anyway, the old ECRM VR30 and RIP lives on now, still running Windows 3.1, running COPS, and running Harlequin RIP 2.5.

I replied to him that those days of Win 3.1 and AppleTalk were indeed good times, and he continued:

They really were good times. I actually miss those days. The computers and devices were like puzzles, and trying to get things to talk to each other became a game of one-upmanship with the devices. It's not like that today, because everything seems to work together nicely behind a pretty GUI. I can say I learned a lot in my old days in computers. I came from a hardware background, a world of video terminals, printers, and modems. I used to fix them down to the component level, and at one point I worked as an engineering technician for Visual Technology. The old stuff was made so well and was really rock solid, much more so, I would say, than today's gear. Today we have pretty computers that I think anyone can do the tech job with, and that reduces the role of the technician and network administrator to a much lower level than it was even a few years ago. The parts are throwaway, even when they're repairable, and the network admin's job is disappearing because everything is wizards that users can run to solve their own problems.

Moral of the story: things may get better, but they’re sometimes not as much fun as they were in the good old days.

Conclusion

Send me an email if you’d like to contribute your own troubleshooting “trench tale” for an upcoming issue of WServerNews or for a future article here in my column on WindowsNetworking.com.

Cheers, Mitch Tulloch
Senior Editor, WServerNews
