George Chetcuti

Quick resource utilization check on Windows 2008 Servers

There are a number of tools that give you details about resource utilization on Windows-based machines. Detailed reports are best suited to digging deeper into an existing problem or analyzing a complex issue that requires a lot of detail, but an ad hoc quick performance check can get by with only a few. In fact, routine quick checks are most useful when the results are presented visually and contain explicit values that clearly indicate the state of the resource under test. The built-in Data Collector Sets let you run a quick performance test with a clear, indicative overview of the results.
The following steps show the simple procedure for running the standard System Performance test and determining whether your server's status is optimal:

From Server Manager, expand Diagnostics\Performance\Data Collector Sets and click System Performance – on Windows 7 computers, start Performance Monitor instead.
Right-click System Performance and select Start, or click the green arrow on the menu bar – by default, the system gathers information for 1 minute; a green arrow appears on the node and disappears once the minute is up.
After the minute passes, right-click the System Performance node and select Latest Report – the green writing-pad icon on the menu bar has the same function.

Examine the report, in particular the Resource Overview, for any warnings, such as high utilization of a component.
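For those who prefer the command line, the same test can be driven with the built-in logman utility. The snippet below is a dry-run sketch: it prints the commands rather than executing them, since logman exists only on Windows. The set name "System\System Performance" matches the node shown in Server Manager; on a real server you would drop the wrapper and run the printed commands in an elevated prompt.

```shell
# Dry-run sketch: print the logman commands that correspond to the GUI
# steps above. On a Windows server, replace 'run' with direct execution.
run() { printf '%s\n' "$*"; }

run logman start "System\System Performance"   # kicks off the 60-second capture
run logman query "System\System Performance"   # shows whether the set is still running
# the set stops on its own after one minute; the result then appears under
# Reports\System\System Performance in Performance Monitor
```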

Additionally, by right-clicking the report node and selecting View\Performance Monitor, you can load the traditional Performance Monitor graph and add or remove counters.

From the Resource Overview report above, you can easily spot the memory utilization percentage, which, depending on your specific environment, may indicate increased resource usage. A quick look at these high level counters can s

Privacy by Design – Part 2

The FTC's framework proposes that businesses should only retain consumer data for as long as there is a legitimate need. The data retention period must be reasonable and appropriate. For instance, companies tend to retain old data for long periods because they consider it potentially valuable for some future need; however, consumers might have provided their private data only for the current service or product! In addition, archived consumer private information may be prone to identity theft, and such thefts may go unnoticed for long periods of time. The Commission states that businesses should promptly and securely dispose of data, in any form, for which they no longer have a specific business need. In principle this is an excellent measure, but what if businesses invent some dummy business activity as proof that they still need the data?

Private data accuracy is another point raised in the Commission's draft. Businesses need to ensure that data collected from their customers is accurate and should take reasonable steps to verify this. Many things can go wrong with erroneous or incomplete private data. If consumers gain access to public or private services by means of identity verification, they may gain or lose out if their data is incorrect. This can cause significant harm to individuals, such as when accessing funds or health benefits. Conversely, mischievous persons may take advantage of a weak system!

The draft is open for discussion and as already noted above, the concept of specific business need as regards to retain related data i

Windows Server 2008 R2 Service Pack 1

Service Pack 1 for Windows Server 2008 R2 and Windows 7 provides further improvements and hardens these operating systems. Although SP1 includes previous updates, which many organizations and users have already deployed through Windows Update, Windows Server Update Services (WSUS) or third-party patch management systems, it is common practice to use a service pack as a baseline. That is, successfully deploying a service pack throughout the organization creates a reference point or standard that puts your mind at rest. Some machines, or even servers, might have missed some updates, or an administrator might have intentionally skipped some problematic ones. Over time, patch management standards are likely to end up in a mess!
However, before going for a full deployment of SP1, I suggest that you test your environment. If a patch management inventory is available, look for machines that lack updates with respect to others and find out why. Test the most critical machines in a test or staging environment before updating production ones. Where possible, follow Microsoft's recommendations before applying SP1 and run the System Update Readiness Tool to resolve update inconsistencies. There have been issues with some device drivers, so it is recommended to update these to the latest versions, and some users are reporting SP1 installation failures with an unknown error.
As most organizations run their servers in virtualized environments, you might encounter similar problems when SP1 tries to access some virtual devices. In fact, I had to disable guest add-ons on my virtualized setup in order to be able to install SP1 successfully. For more details about this error and the troubleshooting steps I performed to find the problem, go here.
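As a quick pre-deployment sanity check, you can confirm the current service pack level from the command line before rolling SP1 out. The snippet below is a dry-run sketch (the commands are printed, not executed, since they exist only on Windows); the installer filename is that of the standalone SP1 package, but the local path is hypothetical and you would download it from Microsoft first.

```shell
# Dry-run sketch of a pre-SP1 check; drop the 'run' wrapper on Windows.
run() { printf '%s\n' "$*"; }

# 0 means RTM (no service pack yet), 1 means SP1 is already installed
run wmic os get ServicePackMajorVersion
# once testing is complete, launch the standalone SP1 installer
run windows6.1-KB976932-X64.exe
```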

Microsoft’s Lync Licensing

Microsoft Lync is the latest edition (brand name) of the Microsoft Office Communications Server product line. As most of you may already know, Lync can be deployed locally within an organization (on-premise), or it can be purchased as a service from either Microsoft itself or a third-party managed service provider (MSP). Briefly, to deploy Lync in your organization you need a license for each Lync Server 2010 instance and a Client Access License (CAL) for each user and device. Standard and Enterprise licensing models exist, similar to the other major products delivered by Microsoft. Conversely, if you go for a hosted solution, you face a subscription licensing model, which in my opinion is much simpler to handle!
The Server/CAL licensing model for on-premise implementations incorporates servers, clients and external connector components. Therefore, you need a license for each:

Server instance you will be running, whether Standard or Enterprise Servers

User and device accessing the servers – these are the CALs, of which there are three types

Standard CAL – enables standard features for a user, such as IM and audio & video conferencing between internal computers
Enterprise CAL – enables enterprise features, such as extended conferencing (external & web)
Plus CAL – enables plus features, such as VoIP

To enable all features, a user must be licensed with all three CALs

External Connector, which is an external entity (travelling employee, business partner, etc.) connecting to your servers. There are three External Connectors which are Standard, Enterprise and Plus. External users' licenses can be purchased as CALs or ECs:

CAL – a license for each external user
EC – a license for each server (can have multiple instances) that will be accessed by an unlimited number of external users

Users' CALs (as explained in point 2
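To make the Server/CAL arithmetic concrete, here is a small worked example for a hypothetical deployment: 2 Standard server instances and 150 internal users who need every feature (and therefore all three CALs), with external users covered by ECs of all three types on every server. The numbers and the shell tally below are illustrative only, not official licensing guidance.

```shell
# Illustrative license tally for a hypothetical Lync deployment.
SERVERS=2        # Lync Server 2010 instances (Standard edition)
USERS=150        # internal users needing IM, conferencing and VoIP
CAL_TYPES=3      # Standard + Enterprise + Plus

SERVER_LICENSES=$SERVERS                   # one license per server instance
USER_CALS=$((USERS * CAL_TYPES))           # each user needs all three CALs
EXTERNAL_ECS=$((SERVERS * CAL_TYPES))      # one EC per type, per server

echo "Server licenses: $SERVER_LICENSES"   # 2
echo "User CALs:       $USER_CALS"         # 450
echo "External ECs:    $EXTERNAL_ECS"      # 6
```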

Cmdkey command-line tool

The Windows Cmdkey command creates, lists and deletes stored user names and passwords from a computer. Cmdkey helps administrators and security staff list the credentials stored for a user, and aids in finding evidence or troubleshooting remote access issues! The tool can come in handy when administrators want to give users temporary access to a shared resource without exposing any login details. For example, a user wishes to access a shared folder \data on server \\win2k8web on a temporary basis. An administrator would use a username that has access to the shared resource and, either through a remote script or manually from the user's workstation, type:

Cmdkey /add:win2k8web /user:usernamewithrights /pass:userpassword

Where the syntax is as follows:

Cmdkey /add:<Shared resource> /user:<UserName> /pass:<Password>

Doing so, a new set of credentials is added on the user's workstation without making the user aware of the username and password details! Although a curious and slightly technical user could still find the username, I suggest that the administrator delete these credentials once the user has finished the temporary work, by typing the following:

Cmdkey /delete:win2k8web

The delete operation denies the user access to that shared resource immediately, within the same session, whereas after adding new credentials the user may need to log off and log back on before the shared resource becomes accessible. Other examples of the cmdkey command include the following:

cmdkey /add:Servername /user:Username

Will add a Username to the current logged on user to access
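Putting the section together, the whole temporary-access lifecycle looks like this. The sketch below is a dry run (it prints the commands rather than executing them, since cmdkey is Windows-only), and the server, account and password names are made up; on the user's workstation you would run the printed commands directly.

```shell
# Dry-run of the temporary-access lifecycle; drop 'run' on Windows.
run() { printf '%s\n' "$*"; }

run cmdkey /add:win2k8web /user:tempuser /pass:P@ssw0rd   # grant temporary access
run cmdkey /list                                          # audit the stored credentials
run cmdkey /delete:win2k8web                              # revoke once the work is done
```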

Privacy by Design – Part 1

The safeguards that the Federal Trade Commission (FTC) is proposing are quite reasonable, and it is hard to understand why some were not implemented by vendors in the first place. Building applications and services guided by security best practices would help create a safer environment. The safety measures are not just technical ones but include physical and administrative safeguards. The level of security required depends on the sensitivity of the data, the size and nature of the business operations and the type of risks the business faces. So, what are we talking about?
For example, why was Google's email service not encrypted by default, with encryption just an option that the end user had to set? Google recently announced that it will make HTTPS the default protocol for its email services. The FTC framework suggests that security controls be defined during the planning stages of an application and revised during its deployment and maintenance stages. Some may argue that Google's Gmail took off when cyber criminality was in its infancy – but was it? Is it not the same scenario we have with cloud service providers? How many vendors are building their infrastructure on security best practices? I am pretty sure that there are quite a number of secure cloud setups, but we still lack common standards that regulate cloud computing services!
The FTC framework asserts that businesses should collect only the information needed to fulfill a specific legitimate need and nothing more! A typical example is where a local service provider collects information about unsecured wir

Google’s search algorithm changes

Another update by Google to its search ranking algorithm has caused anger among several businesses which, according to CNNMoney, have seen a negative impact on their websites' traffic! Many businesses rely on search engines to drive traffic to their sites – mainly Google, as it is by far the most used engine. According to Google, the goal of the new algorithm changes is to move high-quality sites to the top of the search rankings. Google had been criticized by many users because low-quality sites were ranking high. It is quite normal for minor changes to the algorithm to be made on a regular basis, but this change was big and had immediate, drastic effects – hopefully for improved search results! The IP address 64.233.179.104 lets you compare the old algorithm against the new one: results from a search at 64.233.179.104 appear as they would have before the latest changes. Check your website or blog; mine has lost some places, but to more relevant content. It's kind of a dirty game – some gain, some lose. You find businesses that focus entirely on SEO techniques to get their sites ranked higher and tend to forget about unique content, about the services or products they provide and about ethical issues! While Google tries to crack down on sites that try to fool the system, the new countermeasures often penalize legitimate websites as well. For more details read Google's blog post – Finding more high-quality sites in search.

Protecting Consumer Privacy

The FTC (Federal Trade Commission) has proposed a framework for businesses and policymakers that would protect consumer privacy while encouraging the development of innovative new products and services. The draft focuses on three main elements: adherence by businesses to better privacy mechanisms throughout the whole process, the provision of simpler and more meaningful privacy options to consumers, and transparency of all data practices. These are categorized as Privacy by Design, Simplified Choice and Greater Transparency. I will be presenting some of the drafted best practices in future posts, but let's start by explaining further the three main elements that make up the framework.

Privacy by Design: 'Companies should promote consumer privacy throughout their organization and at every stage of the development of their products and services.'

The framework suggests that companies should deal with data security from the beginning and not as an afterthought! Security best practices should guide the development of services and products, covering data accuracy, reliability, retention and other protective features. The draft insists on proper data management procedures that must be maintained throughout the whole life cycle of a product or service.

Simplified Choice: 'Companies should simplify consumer choice.'

The draft suggests that private data collected by businesses during routine operations, such as product fulfillment, can do without the privacy options and the additional related steps presented to consumers. The draft refers to these as common acc

Customizing a Data Collector Set

As we saw in the previous post, creating a customized Data Collector Set is pretty straightforward. Go here to read the post! We also saw that the data sources defined were derived from a set template. In this post we are going to see how you can add your own data sources to a previously created set.
To customize an existing Data Collector Set follow these steps:

From Data Collector Sets\User Defined in Performance Monitor, select your custom set, right-click it and select New, then Data Collector.

From the What type of data collector would you like to create? page, type a name for your new data source, select the type and click Next.

Performance Counter Data Collector – add as many performance counters as you like and assign a sample interval
Event Trace Data Collector – add a number of Event Trace Providers and modify their properties
Configuration Data Collector – add registry keys that you want to monitor
Performance Counter Alert – add alerts for specific thresholds bound to performance counters

Click the remaining OK and Finish buttons to complete the procedure.
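The same data sources can also be added from the command line with logman, which is handy when you need to script collector creation across several servers. The snippet below is a dry-run sketch (the commands are printed, not executed, since logman is Windows-only) with made-up set names; -c, -si and -reg are the logman switches for counters, sample intervals and registry keys respectively.

```shell
# Dry-run sketch of scripting the data sources above with logman;
# drop the 'run' wrapper on a Windows server.
run() { printf '%s\n' "$*"; }

# performance counter collector: CPU sampled every 15 seconds
run logman create counter MyCpuCheck -c "\Processor(_Total)\% Processor Time" -si 15
# configuration collector: capture a registry key
run logman create cfg MyRegCheck -reg "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion"
```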

Each data source can later be modified from the details pane (the right-hand pane of Performance Monitor) by right-clicking it and selecting Properties. For instance, the Configuration data collector allows you to add configuration data beyond the registry keys set during the creation of the data source: WMI management paths, file capture and state capture. Data sources that are no longer needed can be deleted from the list of sources in the user-defined set.


Data Collector Sets

Most systems administrators have used Performance Monitor to view real-time performance data on Windows servers and identify bottlenecks. Some may have also recorded sessions and later analyzed log files for performance issues. In fact, this is what I will be talking about in the couple of posts to come, mainly Data Collector Sets. Data Collector Sets gather system information, including configuration settings and performance data, and store it in a data file. Before moving on to a brief explanation of how to create a Data Collector Set, let's look at some built-in features and basics:
AD Diagnostics: this built-in Data Collector Set is found only on domain controllers and runs for 5 minutes. It logs data about the kernel, Active Directory, AD registry configuration and performance counters.
LAN Diagnostics: this built-in Data Collector Set is started and stopped manually, and hence runs until you stop it! It logs network performance counters, network configuration data and diagnostics tracing.
System Performance: this built-in Data Collector Set logs processor, disk, memory and network performance counters as well as kernel tracing. It stops automatically after one minute.
System Diagnostics: this built-in Data Collector Set logs detailed system information plus all the information included in the System Performance set. It stops automatically after one minute.
Wireless Diagnostics: this built-in Data Collector Set is present only on computers with wireless capabilities and includes the same information as the LAN Diagnostics set plus information relevant to troubleshooting wireless network connections. This set does not stop automatically, so you need to stop it yourself.
To start a set, right-click it and then choose Start. If you are troubleshooting a problem, I suggest that you try to replicate the problem if possible! You can view the results in the Reports node; however, right-clicki
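The built-in sets above can also be driven from the command line with logman, which is useful for scheduled or remote collection. As before, this is a dry-run sketch (the commands are printed, not executed, since logman exists only on Windows); on a real server you would run the printed commands in an elevated prompt.

```shell
# Dry-run sketch of managing the built-in sets with logman;
# drop the 'run' wrapper on a Windows server.
run() { printf '%s\n' "$*"; }

run logman query                               # list available Data Collector Sets
run logman start "System\System Diagnostics"   # start a built-in set
run logman stop "System\LAN Diagnostics"       # manual-stop sets must be stopped explicitly
```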
