
Gartner positions Microsoft as a leader in the Magic Quadrant for Operational Database Management Systems


Microsoft is placed furthest in vision and highest for ability to execute within the Leaders Quadrant.

By T.K. “Ranga” Rengarajan

With the release of SQL Server 2014, the cornerstone of Microsoft’s data platform, we have continued to add more value to what customers are already buying. Innovations like workload-optimized in-memory technology, advanced security, and high availability for mission-critical workloads are built in instead of requiring expensive add-ons. We have long maintained that customers need choice and flexibility to navigate this mobile-first, cloud-first world, and that Microsoft is uniquely equipped to deliver on that vision both in trusted environments on-premises and in the cloud.

Industry analysts have taken note of our efforts and we are excited to share Gartner has positioned Microsoft as a Leader, for the third year in a row, in the Magic Quadrant for Operational Database Management Systems. Microsoft is placed furthest in vision and highest for ability to execute within the Leaders Quadrant.

Customers are trying to do more with data than ever before, across a variety of data types and at large volumes, so the complexity of managing and gaining meaningful insights from the data continues to grow. One of the key design points in Microsoft’s data strategy is ensuring ease of use in addition to solving complex customer problems. For example, you can now manage both structured and unstructured data through the simplicity of T-SQL rather than requiring mastery of Hadoop and MapReduce technologies. This is just one of many examples of how Microsoft values ease of use as a design point.

Gartner also recognizes Microsoft as a leader in the Magic Quadrant for Business Intelligence and Analytics Platforms and placed Microsoft as a leader in the Magic Quadrant for Data Warehouse Database Management Systems – recognizing Microsoft’s completeness of vision and ability to execute in the data warehouse market.

Offering only one piece of the data puzzle isn’t enough to satisfy all the different scenarios in today’s environments and workloads. Our commitment is to make it easy for customers to capture and manage data and to transform and analyze that data for new insights.

Being named a leader in the Operational DBMS, BI & Analytics Platforms, and DW DBMS Magic Quadrants is incredibly important to us: We believe it validates that Microsoft is delivering a comprehensive platform that ensures every organization, every team and every individual is empowered to do more and achieve more because of the data at their fingertips.

You can download a trial of SQL Server 2014 to get started on-premises today, or get up and running in minutes in the cloud. For more details on Microsoft Azure’s data and analytics services, as well as a free trial, visit http://azure.microsoft.com/en-us/

*The above graphic was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available upon request. Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.


Lunch Break, ep. 8: Rajesh Jha, Corp. VP, Office 365 & Exchange (part 2)


This week’s episode of the “Lunch Break” series is part 2 of my discussion with Rajesh Jha, the Corp. VP responsible for Office 365 and Exchange.

In part two of our convo we discuss:

  • Rajesh’s first job out of college 25 years ago.
  • Why managed e-mail has become such a popular topic.
  • The arguably tense first meeting we had together.
  • How to balance user empowerment and data security.
  • The advice Rajesh would give himself today on the first day of college.

Next week, I ride with Prof. Ed Lazowska from the University of Washington’s Computer Science & Engineering department.  Prof. Lazowska is a genuine pioneer in the computer science community, and it was an honor to get to spend some time together.

You can subscribe to these videos here.

Machine Learning Forms the Core of Data Science


This guest post is by the faculty of our Data Science MOOC, Dr. Stephen Elston, Managing Director at Quantia Analytics & Professor Cynthia Rudin from M.I.T.

Machine learning forms the core of data science and predictive analytics. Creating good ML models is a complex but satisfying undertaking. Aspiring data scientists can improve their ML knowledge and skills with this edX course. In the course, we help you build ML skills by investigating several comprehensive examples. It’s still not too late to sign up.

The faculty are available for a live office hour on Oct 19th to answer all your questions – register here.

Creating good ML models is a multi-faceted process, one that involves several steps, including:

  • Understand the problem space. To have impact, ML models must deliver useful and actionable results. As a data scientist, you must develop an understanding of which results will be useful to your customers.

  • Prepare the data for analysis. We discussed this process in a previous blog post.

  • Explore and understand the structure of the data. This too was discussed in a previous blog post.

  • Find a set of features. Good feature engineering is essential to creating accurate ML models that generalize well. Feature engineering requires both an understanding of the structure of the data and the domain. Improving feature engineering often produces greater improvements in model performance than changes in parameters or even the exact choice of model.

  • Select a model. The nature of the problem and the structure of the data are the primary considerations in model selection.

  • Evaluate the performance of the model. Careful and systematic evaluation of model performance suggests ideas for improvement.

  • Cross validate the model. Once you have a promising model, perform cross validation on it. Cross validation helps ensure that your model will generalize.

  • Publish the model and present results in an actionable manner. To add value, ML model results must be presented in a manner that users can understand and use.

These steps are performed iteratively. The results of each step suggest improvements in previous steps. There is no linear path through this process.

Let’s look at a simplified example. The figure below shows the workflow of an ML model applied to a building energy efficiency data set. This workflow was created using the drag-and-drop tools available in Microsoft Azure ML Studio.

You can find this dataset as a sample in Azure ML Studio, or it can be downloaded from the UCI Machine Learning Repository. These data are discussed in the paper by A. Tsanas, A. Xifara: 'Accurate quantitative estimation of energy performance of residential buildings using statistical machine learning tools', Energy and Buildings, Vol. 49, pp. 560-567, 2012.

These data contain eight physical characteristics of 768 simulated buildings. These features are used to predict the buildings’ heating load and cooling load, measures of energy efficiency. We will construct an ML model to predict a building’s heating load. The ability to predict a building’s energy efficiency is valuable in a number of circumstances. For example, architects may need to compare the energy efficiency of several building designs before selecting a final approach.

The first five modules in this workflow prepare the data for visualization and analysis. We discussed the preparation and visualization of these data in previous posts (see links above). Following the normalization of the numeric features, we use a Project Columns module to select the label and feature columns for computing the ML model.

The data are split into training and testing subsets. The testing subset is used to test or score the model, to measure the model’s performance. Note that, ideally, we should split this data three ways, to produce a training, testing and validation data set. The validation set is held back until we are ready to perform a final cross validation.

A decision forest regression model is defined, trained and scored. The scored labels are used to evaluate the model. A number of performance metrics are computed using the Evaluate Model module. The results produced by this module are shown in the figure below.

These results look promising. In particular, the Relative Absolute Error and the Relative Squared Error are fairly small. These values indicate the model residuals (errors) are relatively small compared to the values of the original label.

Having good performance metrics is certainly promising, but these aggregate metrics can hide many modeling problems. Graphical methods are ideal to explore model performance in depth. The figure below shows one such plot.

This plot shows the model residuals vs. the building heating load. Ideally, these residuals or errors should be independent of the variable being predicted. The plotted values have been conditioned by the overall height feature. Such conditioned plots help us identify subtle structure in the model residuals.

There is little systematic structure in the residuals. The distribution of the residuals is generally similar across the range of heating load values. This lack of structure indicates consistent model performance.

However, our eyes are drawn to a number of outliers in these residuals. Outliers are prominent in the upper and lower left quadrants of this plot, above about 1.0 and below about -0.8 for heating loads below 20. Notice that outliers occur for each of the two possible values of the overall height feature.

One possibility is that some of these outliers represent mis-coded data. Could the values of overall height have been reversed? Could there simply be erroneous values of heating load or of one of the other features? Only a careful evaluation, often using other plots, can tell. Once the data are corrected, a new model can be computed and evaluated. This iterative process shows how good ML models are created.

This plot was created using the ggplot2 package with the following R code running in an Azure ML Execute R Script module:

frame1 <- maml.mapInputPort(1)

## Compute the model residuals
## (the scored column name must match your experiment's output;
## "ScoredLabel" is assumed here)
frame1$Resids <- frame1$HeatingLoad - frame1$ScoredLabel

## Plot residuals vs HeatingLoad, conditioned on OverallHeight.
library(ggplot2)
ggplot(frame1, aes(x = HeatingLoad, y = Resids,
                   color = OverallHeight)) +
    geom_point() +
    xlab("Heating Load") + ylab("Residuals") +
    ggtitle("Residuals vs Heating Load") +
    theme(text = element_text(size = 20))

Alternatively, we could have generated a similar plot using Python tools in an Execute Python Script Module in Azure ML:

def azureml_main(frame1):
    import matplotlib
    matplotlib.use('agg')  # Set a non-interactive graphics backend
    import matplotlib.pyplot as plt

    ## Compute the residuals
    frame1['Resids'] = frame1['Heating Load'] - frame1['Scored Labels']

    ## Split the data frame by the two values of Overall Height
    temp1 = frame1.loc[frame1['Overall Height'] < 0.5]
    temp2 = frame1.loc[frame1['Overall Height'] > 0.5]

    ## Create a scatter plot of residuals vs Heating Load,
    ## color-coded by Overall Height
    fig = plt.figure(figsize=(9, 9))
    ax = fig.gca()
    temp1.plot(kind='scatter', x='Heating Load', y='Resids',
               c='DarkBlue', alpha=0.3, ax=ax)
    temp2.plot(kind='scatter', x='Heating Load', y='Resids',
               c='Red', alpha=0.3, ax=ax)
    fig.savefig('Resids.png')  # The saved image is what Azure ML displays
    return frame1              # Execute Python Script modules must return a data frame

Developing the knowledge and skills to apply ML is essential to becoming a data scientist. Learning how to apply and evaluate the performance of ML models is a multi-faceted subject – we hope you enjoy the process. 

Stephen & Cynthia

(Cloud) Tip of the Day: What's New in Azure Networking


Earlier at Ignite, we announced a number of widely anticipated and exciting new features for Azure Networking.

  • ExpressRoute and ExpressRoute Premium Add-on
  • ExpressRoute for Office 365 and Skype for Business Enterprise Voice
  • New VPN Gateway offers Site-to-Site VPN and ExpressRoute coexistence
  • User Defined Routes
  • Reserved IP Mobility
  • PIP DNS name association
  • Multiple VIPs per cloud service
  • New Network Virtual Appliance partners
  • Azure DNS
  • Networking support for Azure Resource Manager

Check out the full list and descriptions on the Azure blog.

Friday - International Update - Get to Know the New Windows 2016.

Welcome to our Wiki Ninja community international update.

Today's post is special, because we are going to talk about the new Windows 2016.

We know that the Windows 2016 platform is one of Microsoft's newest technologies. Beyond traditional services such as the user catalog - "Active Directory" - which the platform supports and manages, it operates on several other fronts. It is worth remembering that, with so many companies now close to cloud services, many business segments are administered with Microsoft Windows. Without a doubt, this incredible platform is one of the most robust technologies at the vanguard of corporate strategies.

For an overview of the innovations in Windows 2016, watch the video.




Microsoft Virtual Academy - MVA

For those starting a career in infrastructure, or who need to get up to date, our recommendation is the MVA course on Windows 2016.

Visit: What's New in Windows Server 2016 Preview

You will find several modules covering the various features of Windows.

The download can be done at:

https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-technical-preview


TechNet Wiki

In addition to the free MVA training, the community can also count on the help of the contributors who have written several articles on our Wiki.

Below you will find an interesting collection of resources about the platform.

Windows Server 2016 CTP: ADDS Powershell Cmdlets

Windows Server 2016 Remote Desktop Services: Introducing Personal Session Desktops

Experience guide for Enabling OpenGL Support for vGPU in Server 2016



We hope to contribute to your professional growth by sharing this interesting information about Windows 2016.

Share your knowledge with the community.

Thank you for the opportunity.

Wiki Ninja Hezequias Vasconcelos

What the IT Week Brought: Vacation Memories

Kanty> I'm starting to get the feeling that our family vacations are somehow cursed. Every trip abroad comes with some health mishap. When we flew out for the first time a few years ago, my wife boarded the plane with a fever. Three days by the sea cured her, but still, the start of the vacation wasn't much fun. Regular readers of the TN blog may remember how our last year's...(read more)

Page File - The definitive guide


Hello!

Today I will share with you my best practices for configuring the paging file in Windows Server 2008 and 2012.

The paging file seems to be a very popular subject, as we get questions about it all the time. Many customers configure the paging file incorrectly, based on outdated rules of thumb that no longer apply to modern operating systems like Windows Server 2008 and above.

Memory, Paging and Paging file

Let's start with the basics: Windows memory management is based on virtual memory, where each process has its own private virtual address space. When approaching a low-memory condition, Windows moves the least-used memory pages to a hidden file called the page file.

The page file is a special file used by Windows to store modified pages, and the process of moving pages from RAM to the page file is called "paging".

Page files have two primary roles:

  • Acting as a physical extension of RAM that stores modified data
  • Recording information about the state of the system in case of a system crash

I will explain the size requirements for each role.


Physical extensions of RAM

If your server exhausts all available RAM and you don't have a page file, applications will crash or hang because Windows is unable to allocate more memory. Even worse, in some cases Windows itself can become unstable.

The Windows commit limit, also known as the system commit limit, is the sum of the current paging files' size and the amount of physical memory that Windows can use to allocate memory. For example: if you have 16 GB RAM and a 16 GB paging file, your commit limit is 32 GB.

The commit limit can be increased by enlarging the current paging files, adding new paging files, or adding more RAM.
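You can read both numbers at any time from the standard Memory performance counters; a minimal check from PowerShell:

Get-Counter -Counter '\Memory\Committed Bytes', '\Memory\Commit Limit', '\Memory\% Committed Bytes In Use'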

Crash dump size

On business-critical servers it's recommended to configure the server to capture memory dumps for analysis. Windows uses the paging file as a placeholder for memory dumps: the crash dump is written to the page file first, and the SMSS process then copies it to a separate memory dump file.

This affects the page file size, because the file needs to accommodate all the information Windows records during a crash; more on this later.

Sizing the paging file

The old rules of thumb (page file size = RAM * 1.5, or RAM * 2) make no sense on modern systems, where the logic should be: the more RAM you have, the less paging file you need.


So, how should you size your Page File?


Well, we don't have a magical value that will fit any system and any workload. It really depends on the specific workload and the type of server.

When sizing the page file we need to consider our applications' memory needs and our crash dump settings.

How do you know how much memory your application needs? The best way is to take a baseline.

  1. Run Performance Monitor (Perfmon)
  2. Go to Data Collector Sets\User Defined
  3. Right-click on User Defined and select New
  4. Select Create manually and click Next
  5. Check Performance counter
  6. Add the following counters:
      • Memory\Committed Bytes - the amount of committed virtual memory, in bytes
      • Memory\Commit Limit - the amount of virtual memory that can be committed without having to extend the paging file
      • Memory\% Committed Bytes In Use - the ratio of Memory\Committed Bytes to Memory\Commit Limit


Note: Make sure you collect the information over a long period (one week at least) and that it includes times when the server is running at peak usage.
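If you prefer scripting the baseline, the same three counters can be sampled with Get-Counter and written to a log file; a minimal sketch (the interval, duration and path are assumptions you should adapt):

# 2016 samples at 300-second intervals = 7 days of data
Get-Counter -Counter '\Memory\Committed Bytes', '\Memory\Commit Limit', '\Memory\% Committed Bytes In Use' `
    -SampleInterval 300 -MaxSamples 2016 |
    Export-Counter -Path C:\PerfLogs\CommitBaseline.blg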

The page file size formula should be:

(RAM size - maximum value of Committed Bytes) * 1.2 (the extra 20% is a buffer to accommodate workload bursts)

For example: If the server has 24 GB RAM and the maximum of Committed Bytes is 22 GB, then the recommended page file will be: (24 - 22) * 1.2 = 2.4 GB
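Expressed as a quick PowerShell calculation (the 24 GB and 22 GB values are simply the figures from the example above):

$ramGB          = 24   # physical RAM
$maxCommittedGB = 22   # peak Memory\Committed Bytes from your baseline
$pageFileGB = ($ramGB - $maxCommittedGB) * 1.2
"Suggested page file size: {0:N1} GB" -f $pageFileGB   # 2.4 GB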

What about the second factor: the size we need to record information when the system crashes?


The size of the memory dump is determined by its type:

  • Complete Memory Dump: RAM size + 257 MB
  • Kernel Memory Dump: the amount of kernel-mode memory in use (the maximum is 2 GB on 32-bit; on 64-bit it can go up to 8 TB)
  • Small Memory Dump: 64 KB – 512 KB

In most cases the Kernel Memory Dump is good enough for root cause analysis; a Complete Memory Dump is only required in specific cases, for example when you need to see what happened in user mode.
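The dump type is controlled by the CrashDumpEnabled value under the CrashControl registry key; a minimal sketch for selecting a kernel dump (a reboot is required for the change to take effect):

# 0 = None, 1 = Complete, 2 = Kernel, 3 = Small; 7 = Automatic (Windows 2012 and later)
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\CrashControl' `
    -Name CrashDumpEnabled -Value 2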

From my experience, the size for Kernel Memory Dump is usually the following:

  • On systems with up to 256 GB RAM: 8-12 GB for the kernel memory dump
  • On systems with up to 1.5 TB RAM: 8-32 GB for the kernel memory dump

However, these numbers are NOT a Microsoft official recommendation, and may be different on your servers so always test before you apply.

Where to put the page file?

Starting with Windows Server 2008, you can place the page file on any partition, not just the system partition.

From the performance perspective, you will benefit only if Windows is using the page file extensively (Committed Bytes is greater than the size of RAM).

Besides performance, disk snapshots and disk replication are also good reasons to move the page file to another partition on a different disk.

In that case, just make sure the partition is on a different physical disk (spindle), not merely a different partition on the same disk.

Configuring the page file settings

You can configure the page file by using System Properties:

  1. Run sysdm.cpl
  2. Go to Advanced
  3. Select Settings under Performance
  4. Go to Advanced (again)
  5. Select Change under Virtual Memory

Let's explore the different options:


 

  • System managed size – This is the default option and the one recommended by Microsoft
  • Custom size – Allows you to manually configure the size of the paging file
  • No paging file – Configures the system to run without a page file

When using System Managed, Windows determines the size of the page file based on the amount of physical memory:

  • Less than 1 GB of RAM
    • The minimum size will be 1.5 * RAM, the maximum size will be 3 * RAM
  • 1 GB of RAM or more
    • The minimum size will be 1 * RAM, the maximum size will be 3 * RAM

On 32-bit systems the maximum size of the page file will be 4 GB, a relatively small size on modern systems. On 64-bit systems, for example a SQL Server machine with 128 GB of RAM, the maximum page file size will be 384 GB. That doesn't mean Windows will automatically generate a 384 GB file; it will only grow that large if the system needs that amount of virtual memory.


System Managed is the default option, and while it allows you to run the system without worrying about the size of the paging file in most situations, there are two potential issues:

  1. Disk space – On systems with loads of RAM, the page file size will be huge
  2. Page file fragmentation – If the paging file expands and shrinks, it can cause disk fragmentation and poor performance

This is the reason I usually recommend manually configuring the paging file on these kinds of servers (with large amounts of RAM).
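If you do configure the page file manually, the same change can be scripted through WMI rather than clicked through System Properties; a minimal sketch (the 4096 MB initial/maximum values are arbitrary examples, and a reboot is needed for the change to apply):

# Take the page file out of automatic management
Get-CimInstance Win32_ComputerSystem |
    Set-CimInstance -Property @{ AutomaticManagedPagefile = $false }

# Set a fixed size for the existing page file (values are in MB)
Get-CimInstance Win32_PageFileSetting |
    Set-CimInstance -Property @{ InitialSize = 4096; MaximumSize = 4096 }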


Starting with Windows 2012 there is a new behavior designed to reduce the size of the paging file on systems with a large amount of RAM. When System managed size is used and Automatic memory dump is enabled (the default), the maximum size will still be 3 * RAM, but the minimum size will be determined by several factors, such as the size of RAM, committed memory history and free disk space, resulting in a much smaller paging file (usually less than 1 GB).


 
No paging file

As we discussed there is no need to disable the page file, even if your server has plenty of RAM.

There are two additional special cases regarding the need for page file:

  • Application requirements: Domain Controllers, DFS Replication, Certificate and ADAM/LDS servers are not supported without a page file.
  • Guest VMs with Dynamic Memory enabled (Hyper-V): Dynamic Memory requires that the VM has a page file.

 

As always, voice your opinions and ideas in the comments.

 

Use PowerShell to Collect Network Traces—The Video Part 1


Summary: Ed Wilson, Microsoft Scripting Guy, presents a video that shows how to use Windows PowerShell to collect network traces.

Microsoft Scripting Guy, Ed Wilson, is here. Today I present a video that shows how to use Windows PowerShell to collect network traces. The steps I show are the common commands that are normally run to set up and collect network tracing. I talk about the following:

  • Creating the session
  • Adding the provider
  • Starting the session
  • Ending the session
  • Importing the log
  • Removing the session

   Note  For more information about this technique, see Use PowerShell to Parse Network Trace Logs.
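For readers who prefer text to video, here is a minimal sketch of that sequence using the NetEventPacketCapture cmdlets (the session name, provider and paths are example values):

# Create the session and attach a provider
New-NetEventSession -Name Trace1 -LocalFilePath C:\Traces\Trace1.etl
Add-NetEventProvider -Name 'Microsoft-Windows-TCPIP' -SessionName Trace1

# Start tracing, reproduce the network activity, then stop
Start-NetEventSession -Name Trace1
Stop-NetEventSession -Name Trace1

# Import the resulting ETL log and clean up
$log = Get-WinEvent -Path C:\Traces\Trace1.etl -Oldest
Remove-NetEventSession -Name Trace1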


Here is a link to the video from YouTube if you would like to download it or play it offline in a different video player: Introduction to PowerShell network tracing by Ed Wilson.

Join me tomorrow when I will talk about more cool Windows PowerShell stuff.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy

Windows PowerShell, Scripting Guy!, Networking, Troubleshooting, Video


Tip o’ the Week #298 – Searching and finding


Who keeps an up-to-date browser favourites list these days? Most people seem to find web sites by Binging/Googling (other search engines are available(!), though some of the pioneers are no longer around) for the site they know about, rather than trying to keep a link that might change. This relates to the filing vs. piling analogy of document and email retention, which has been covered before (here).

[The precis is that some people find or recall things by where they are, like in a folder specific to that customer or project, whereas others might have a massive pile of unsorted stuff, but they can recover items within it by remembering key words or attributes, and searching the contents]

You’d think that by now, we’d all be experts at plugging queries into search engines, maybe even doing so before posting stupid stuff on Facebook. Hint – if anything looks dodgy or unbelievable, try searching snopes.com. Please.

Anyway, here are some tips for getting more accurate searches, in a few different places…

Outlook

Did you know you can direct specific search criteria through Outlook’s Search pane? Click on the search box at the top of a folder and you will see the Search menu appear (or the ribbon will automatically show you the Search pane, depending on how you’ve got views set up). If you click on a criterion (like From), then Outlook will build the query for you in the search box, so you can see what it’s doing.

It’s possible to jump ahead a little, though – instead of clicking From then editing, you could just type from:Paul to search for all mail sent by anyone with Paul in their name, or try using a combo of other attributes (there are many – see more here), (eg. to: Paul sent: last week). Lots more example tips here.

Yammer

For many users, Yammer is a great conversational and collaboration tool, but even if you don’t use it frequently to post content, it can be a brilliant way of searching for answers to frequently asked questions that you might not get via email if you aren’t on the right DL.

Thing is, Yammer’s search tends to be a bit overly inclusive – if you enter several terms then you might have one or two more results than you’d expect.

Adding quotes around phrases (“surface 4” “release date”) helps a bit, but Yammer will still search for any occurrence of either phrase. Adding a + sign to each word or phrase changes the search from an OR to an AND (ie show results with all rather than any of the phrases).

Bing

If you’re looking to trim the results you get from a web search – either carried out from the Bing homepage, or from the address bar in your browser (assuming Bing is the default search engine) – there are a few operators that are worth remembering. Adding site:<url> to your query means you’ll only get results from there, so it may be quicker to use Bing to search a given site than to go to that site itself and search from within.

Eg. try this query: site:engadget.com Lumia –iphone – it will show results from the Engadget site regarding Lumia phones that don’t mention iPhones: not too many results there. Try that same query as a web search rather than news (here), and you’ll notice a few pages in other languages. You could try filtering more by language (here). You can also stack site: clauses with an OR (must be capitals) operator, so you could say “Jenson Button” (site:bbc.co.uk OR site:pistonheads.com).

If you’re after particular types of content, you might want to throw the filetype: operator in, eg Azure filetype:pptx site:microsoft.com. For more details on the kinds of operators Bing supports, see here.

Google users can find some search tips here, too.

Security Focus: Defending PowerShell with the Anti-Malware Scan Interface (AMSI)


Naturally, I was intrigued when I heard that some new anti-virus and anti-malware capabilities were coming to PowerShell in the form of...

 

The Anti-Malware Scan Interface

As we know, PowerShell is an incredibly powerful administration and automation tool, but that same power can be wielded by the bad guys. In fact, PowerShell has proved to be a popular propagation and persistence mechanism. The fact that a payload can exist just in memory and can be obfuscated has made detection challenging. This is where AMSI comes into its own...

AV vendors have to emulate each script host, e.g. PowerShell, VBScript, to attempt to capture the bad stuff. They have to write code to detect and undo the obfuscation techniques employed, i.e. unpick the steps the bad guys use to hide their payloads. This is complicated and expensive. Wouldn't it be great if there was an interface an application could submit content to for a scan?

And, here's what AMSI allows us to do:

  • evaluate code just prior to execution by the script host
  • and, therefore, evaluate code after all the obfuscation has been stripped away

 

Furthermore, because we submit the code prior to execution by the script host, it doesn't matter whether it came from disk or resides only in memory. This overcomes another limitation of the traditional AV approach, i.e. its focus on file system activity.

Here's a very simple example of AMSI in action. I have a script with some nasty content on a share on a compromised computer. The contents of the script have been very simply obfuscated to base64. I'm going to get the base64 string from a remote share and assign it to a variable. I'm then going to convert it back to normal and use Invoke-Expression to run the payload.

$Command = Get-Content '\\halocli1002\C$\AV_Test\AMSI2.ps1'

$NewCommand = [System.Text.Encoding]::Unicode.GetString([Convert]::FromBase64String($Command))

Invoke-Expression $NewCommand

 

Here's what happens:

 

"The script contains malicious content and has been blocked by your antivirus software." 

 

Nice. The decoded, in-memory payload was passed to AMSI prior to execution by the script host.
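If you'd like to see AMSI fire on your own machine without handling real malware, there is a benign AMSI test string (analogous to the EICAR test file) that AMSI-aware engines such as Windows Defender should block. Simply echoing it at a PowerShell prompt is enough:

'AMSI Test Sample: 7e72c3ce-861b-4339-8740-0ac1484c1386'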

Here's the detected threat:

 


Note the Resources property.

PS - AMSI isn't just for PowerShell or script engines. It's designed to be used with... well, whatever wants to make use of it!

 

End of Support is coming for older versions of Internet Explorer


Written by Harry Eagles, Editor, TechNet UK Blog

 

Beginning 12th January 2016, only the latest version of Internet Explorer available for a supported operating system will receive technical support and security updates. It is recommended that those running older versions of Internet Explorer upgrade to the most recent version – Internet Explorer 11 on Windows 7, Windows 8.1 and Windows 10. For information on the latest supported version of Internet Explorer on other operating systems, please see the table below:

Why should I upgrade?

Internet Explorer’s latest version boasts increased performance, improved security, better backward compatibility and elevated support for the modern technologies that power today’s websites and services. Outdated browsers represent a major challenge in keeping the Web ecosystem safer and more secure, as modern web browsers have better security protection. Internet Explorer 11 has features like Enhanced Protected Mode to help keep your customers safer.

What happens if I don’t move to the latest browser by End of Support on 12th January 2016?

An End of Support event means there will be no more security updates, non-security updates or online technical content updates. It will also see the end of free or paid assisted support options for older versions of the browser.

What if my business has a dependency on an application that runs on a version of Internet Explorer that is reaching End of Support?

Fear not! There is a plethora of new features and resources developed to help you upgrade and stay current on the latest browser. To ensure that your migration of applications to current web standards is as smooth as possible, Internet Explorer 11 features Enterprise Mode – offering better backward compatibility and enabling users to run many legacy web applications.

To help customers who have a business need for using Internet Explorer 11 with Enterprise Mode, Microsoft is committed to supporting Enterprise Mode through the duration of the OS’s support lifecycle. Enterprise Mode’s backwards compatibility will also see improvements over time.

 

But what exactly is Enterprise Mode?

Simply put, Enterprise Mode is a compatibility mode that runs on Internet Explorer 11 on Windows 7, Windows 8.1 and Windows 10 devices. It lets websites render using a modified browser configuration that’s designed to emulate either Internet Explorer 7 or Internet Explorer 8, avoiding the common compatibility issues associated with web apps written and tested on older versions of the browser.

Features

Improved Web app and website compatibility - Through improved emulation, Enterprise Mode lets many legacy web apps run unmodified on IE11, supporting a number of site patterns that aren’t currently supported by existing document modes.

Tool-based management for website lists – Use the Enterprise Mode Site List Manager tool to add website domains and domain paths and to specify whether a site renders using Enterprise Mode. You can download the Enterprise Mode Site List Manager tool from the Internet Explorer Download Center (https://www.microsoft.com/en-us/download/details.aspx?id=42501).

Centralised Control – You can specify the websites or web apps to interpret using Enterprise Mode through an XML file on a website or stored locally. Domains and paths within those domains can be treated differently, allowing granular control. Use Group Policy to let users turn Enterprise Mode on or off from the Tools menu and to decide whether the Enterprise browser profile appears on the Emulation tab of the F12 developer tools.
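For illustration, pointing IE11 at a centrally hosted site list can also be scripted by writing the same registry value that the "Use the Enterprise Mode IE website list" Group Policy sets; a minimal sketch (the list URL is a placeholder):

$key = 'HKLM:\SOFTWARE\Policies\Microsoft\Internet Explorer\Main\EnterpriseMode'
New-Item -Path $key -Force | Out-Null
Set-ItemProperty -Path $key -Name SiteList -Value 'http://intranet.contoso.com/sites.xml'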

Important – All centrally-made decisions override any locally-made choices.

Integrated browsing – When Enterprise Mode is set up, users can browse the web normally, letting the browser change modes automatically to accommodate Enterprise Mode sites.

Data gathering – You can configure Enterprise Mode to collect local override data, posting back to a named server. This lets you “crowd source” compatibility testing from key users; gathering their findings to add to your central site list.

 

Browser Migration Resources

There are numerous online support resources that include all the information you need to migrate to the latest version of Internet Explorer:

Internet Explorer TechCenter – The Internet Explorer TechNet site includes technical resources to deploy, maintain and support IE. Enterprise Mode for Internet Explorer 11 is covered in detail, to help customers extend Web app investments by leveraging this new backward compatibility feature.

Microsoft Assessment and Planning (MAP) Toolkit – This is an agentless inventory and planning tool that can assess your current browser install base.

Cloud Platform Release Announcements - October 14, 2015

This post is a translation of Cloud Platform Release Announcements for October 14, 2015, published on October 15. This blog brings together a series of new updates from the Cloud Platform team. In today's mobile-first, cloud-first world, Microsoft delivers the technologies and tools that enable enterprises to adopt a cloud culture. Our differentiated innovation, comprehensive mobile solutions and developer tools help all our customers realize the true potential of the cloud-first era. To serve customers who expect rapid innovation in the cloud, Microsoft offers a broad portfolio of Cloud Platform products. To keep you up to date, we have summarized our latest releases in the list below, with links to more details for those who want further information. This update includes the following:...(read more)

NTFS vs. ReFS in Windows Server 2016 (and "fixed" disks)


Some time ago I wrote about what ReFS is and whether it will replace NTFS ("Czym jest ReFS i czy zastąpi NTFS"). That post was meant as an introduction to what I'd like to share with you today.

I wrote back then that I wanted to show you how dramatically ReFS can speed up certain operations that can take over 20 minutes with NTFS, but that with ReFS and Windows Server 2016 TP3 now take... one second :)

Virtual disks

If you use virtualization, you are certainly using virtual disks. They let you store data in a standardized format and use that data in virtual machines (and not only there!)

You probably also know that virtual disks come in several formats - fixed, dynamic and pass-through. I wrote a post about the disk types and their pros and cons ("Typy dysków w maszynach wirtualnych") back in 2008, and I strongly encourage you to read it.

Today, however, I'd like to show you what ReFS does with fixed disks.

Fixed disks and ReFS

Fixed-size disks use as much physical space on the host's disk as we specify when creating them. So if we create a 100 GB disk, we must have that much space on the physical disk right away, because creating such a disk immediately creates a 100 GB VHD(X) file.

Unlike dynamic disks, fixed disks do not expand, so there are no problems with fragmentation of the host's disk. In addition, because the disk never has to be grown before a write, performance is generally better.

Fixed disks are also worth using if you want a relatively simple infrastructure and prefer to keep potentially growing machines under control.

Fixed disks in Windows Server 2012

You may know that creating a fixed VHD(X) disk in Windows Server 2012 R2 took a really long time. This was very annoying for our customers, who nevertheless needed to use this type of disk.

Fixed disks in Windows Server 2016 (with ReFS)

The answer is Windows Server 2016 with the ReFS file system on the disks. In this configuration, the speed of creating fixed disks has increased dramatically.

Test 1: A 100 GB fixed VHDX disk

The tests were run on a Lenovo W510 laptop (Core i7, 16 GB RAM) with a second hard drive (a 500 GB 7200 RPM HD attached over SATA) that was formatted and completely empty - on it, a 100 GB fixed VHDX disk was created.

The operating system used was Windows Server 2016 Technical Preview 3, installed on the primary disk, so it did not interfere with the physical test disk. The machine was doing nothing else and was under no other load.

The tests were run with different block sizes (from 4k to 64k for 2016 with NTFS, and 4k and 64k for 2016 with ReFS). After each test the disk was reformatted with the new block size and the system was restarted before the next test.
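A fixed disk like the ones used in these tests can be created, and its creation timed, with the Hyper-V PowerShell module; a minimal sketch (the path is an example):

Measure-Command {
    New-VHD -Path D:\Test\Fixed100.vhdx -SizeBytes 100GB -Fixed
}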

And here are the results:

Block Size      Windows Server 2016 NTFS    Windows Server 2016 ReFS
4k (default)    18:08                       <1 second
8k              18:07                       N/A
16k             18:09                       N/A
32k             18:03                       N/A
64k             18:05                       <1 second

This is really impressive - the disk creation time dropped from about 18 minutes to... 1 second!

Test 2: A 400 GB fixed VHDX disk

This test follows exactly the same procedure, only now for a 400 GB disk. For simplicity, and to save time, only the 4k and 64k block sizes were tested.

Block Size      Windows Server 2016 NTFS    Windows Server 2016 ReFS
4k (default)    1:23:27                     ~1 second
64k             1:23:24                     ~1 second

Does anything more need to be said? Going from almost 1.5 hours down to 1 second is a truly enormous change.

What about VHD disks?

You may be wondering about your VHD disks, which you might need for compatibility with Microsoft Azure. I have very good news here - VHD disks behave practically the same as VHDX. But you can check that for yourselves :)

Oh, and remember that we strongly suggest using VHDX disks (rather than VHD) because of their greater resilience to failures, support for up to 64 TB (per disk), and much better performance when using snapshots. And of course they natively support 4k-sector disks.

I hope you will try ReFS in your own infrastructure :)

Helping build Saudi Arabia’s future in the cloud


Posted by Abdulla Al Uzaib, Marketing & Communication Lead, Microsoft Arabia

 In 2006, when Saudi Arabia had yet to embrace the cloud, co-founder of Virtual Vision (V2), Hazem Sandouka, had the foresight to see that this technology was the way of the future – and the company was born. Working with Microsoft, the company began by focusing on providing migration, implementation and consultation, while developing relationships to accelerate the business.

With its forward-thinking vision, V2’s move to the cloud is not only helping the company grow, but it’s also helping Saudi Arabia move forward in a mobile-first, cloud-first world.

The team’s expertise, along with their partnership with Microsoft, will see V2 opening a cloud datacentre for the country in 2016, which Sandouka believes will be one of the best products running in the market. The company really is an integral part of building Saudi Arabia’s future.

To read more about V2, click here.

How to get 32,000 people to sell your product - Welcome to Global Sales Day for software houses



Microsoft wants to gather all the most important software houses (ISVs) in our ecosystem for a special ISV day with a special agenda. We have a distinguished visitor: our global head of sales, responsible for 32,000 sellers. Jean-Philippe Courtois wants to highlight our focus on software houses and how his sales force can sell your product to customers all over the world.

Our overarching goal is to make our partners' products visible around the world. Hear from President International Jean-Philippe Courtois, Country GM Michael Jacobs, Microsoft enterprise account lead Cecilie Vanem, and successful Norwegian software houses about how to package and sell products for the global marketplace.

Date: 27 October

Time: 09.30-14.00

Location: Lysaker Torg 45

The day's programme:

John Henrik Andersen, CTO of Microsoft Norway, opens the day.

Jean-Philippe Courtois, President of Microsoft International, led Microsoft's sales organization as revenue grew from NOK 370 billion to NOK 790 billion, with a healthy bottom line that has generated NOK 1,461 billion in profit over 10 years. How can his 32,000 employees be an important resource for your future growth and success as well?

Michael Jacobs, Country GM Norway and head of Microsoft Norway, explains why software houses are important and why he is personally engaged.

Cecilie Vanem, Enterprise Group Lead, talks about the importance of driving product innovation and how packaged solutions can be sold to her largest customers and to the rest of the world.

Partners: we hear from selected partners who are well on their way into the global market. They share their stories and how Microsoft was an important resource in their work.

Marius Dahl, Business Advisor, has broad experience in packaging products and solutions for sale. He will share his thoughts and experience on sales and marketing strategy and how you can best package solutions for global sales.

Arif Shafique, Business Developer, is responsible for Norwegian software houses. He shows how the local team in Norway works to take Norwegian software houses out into the world.

 

Welcome to the day that will change the way you do business!

Best regards,

Microsoft Developer Experience, Microsoft Norge AS


Microsoft FastTrack: Evolving into a Customer Success Service


(This article is a translation of The evolution of Microsoft FastTrack—the customer success service, posted on Office Blogs on October 7, 2015. For the latest information, please refer to the original article.)

Over the past year, we have helped customers with IT challenges of all levels of complexity move to Office 365, reducing implementation costs, shortening time to value, and delivering a great customer experience.

As part of delivering the best possible experience to our customers, and drawing on customer feedback, we are announcing that FastTrack is evolving from a one-time onboarding service into a customer success service aimed at generating business value from the Microsoft cloud more quickly.

First, FastTrack changes from a one-time benefit to an ongoing benefit. For the duration of your subscription, you can request help onboarding new users and capabilities whenever you need it, as many times as you need it.

FastTrack is also being extended to provide customized assistance and resources at every stage: envisioning, onboarding, and driving business value. This follows the addition earlier this year of Enterprise Mobility Suite (EMS), Azure Active Directory Premium, Intune, and Azure Rights Management to the FastTrack onboarding service.

The FastTrack approach to customer success

Delivering a new level of service

FastTrack provides best practices, tools, and resources, as well as remote assistance tailored to your needs, through the new FastTrack.microsoft.com website and the FastTrack Center - a team of hundreds of engineers who to date have provided remote onboarding and migration assistance to IT professionals and partners around the world. Going forward, the services offered on the web and through the FastTrack Center are being extended in the following areas:

  • Envision - FastTrack.microsoft.com provides resources and tools for building custom plans - including technical implementation and adoption strategies for services such as Office 365, EMS, and Azure - so that customers can complete onboarding and migration successfully.
  • Onboard - When you're ready, request onboarding. Through remote assistance tailored to your needs, FastTrack engineers assess your technical environment and work with your IT staff or partner to make onboarding and migration go smoothly.
  • Drive business value - Our goal is to help you get the most out of your IT investment. We provide best practices, guidance, and resources for driving Office 365 adoption to help you get started, and can refer you to certified partners for more in-depth services where needed. Tools and guidance for modernizing existing IT practices and managing change effectively are also available.

Get started today

We hope you will take advantage of the new FastTrack. Whether you're new to Office 365 or your migration is well under way, we recommend checking out the following FastTrack resources:

  • To get started right away, visit FastTrack.microsoft.com.
  • For details on upcoming updates, check the Office 365 roadmap.
  • Join the YamJam hosted by the Office 365 Network on Tuesday, October 13, 9-10 AM Pacific Daylight Time (Wednesday, October 14, 1-2 AM Japan time) to learn about the benefits of FastTrack and ask questions in real time.

—Arpan Shah (Senior Director, Office 365 team)

 

Frequently asked questions

Q. What are the requirements for using FastTrack?

A. The resources and best practices published at FastTrack.microsoft.com are available to all customers. In addition, customers who have purchased 150 or more licenses of an Office 365 Enterprise SKU, paid Government or Education SKU, Kiosk SKU, or Nonprofit SKU can also use the FastTrack Center.

Q. When will the new FastTrack be available?

A. Eligible customers can get deployment planning assistance, onboarding services, and migration services from the FastTrack Center today.

Q. How can I stay up to date?

A. FastTrack is a continually evolving service. Upcoming updates will be published on the Office 365 roadmap; visit roadmap.office.com for the current list of planned updates.

Q. What onboarding and migration services does the FastTrack Center provide?

A. For details, see the service description.

Q. In which regions and languages is FastTrack available?

A. FastTrack is available in all regions. The FastTrack Center provides remote assistance in English, Portuguese (Brazil), French, German, Italian, Japanese, Spanish, and Traditional Chinese. FastTrack.microsoft.com supports English only as of launch (October 7, 2015); content in the languages above is planned for release in November.

Q. Where can customers and partners find more information about Microsoft FastTrack for Office 365?

A. For the benefits of FastTrack, see FastTrack.microsoft.com and the Microsoft FastTrack for Office 365 service description. Partners can learn about the FastTrack business opportunity on this page.

Calling all African student developers: It’s time for Imagine Cup 2016


Posted by Amnitas Neto 

It’s no surprise that Microsoft believes in the power of young people connecting with technology to bring their bold ideas to life – after all, the company was originally founded by students. That’s why, for the past 13 years, Imagine Cup has been inspiring student developers around the world to create innovative solutions that change the way we live, work and play, while also growing the skills they need to pursue a future in technology. Now that the 2016 season is underway, I strongly encourage you to get involved.

I don’t have to tell you about the enormous talent coming out of the African continent. In the 2015 season of Imagine Cup, three teams of African students made it to the World Finals, where they competed against 30 other global teams.

Team Digital Interactive Games from South Africa was a finalist in the Games category with its project ‘PYA Maze of Gods’, a 3D game built to challenge the user’s problem-solving skills, reaction time and ability to overcome obstacles. Also in the Games category, team T2 from Tunisia entered ‘Back in Time’, a game in which parents and their children compete against each other using their mobile devices and PCs. Team LifeWatch from Nigeria was recognised in the World Citizenship category for their AsthmaVisor solution, made up of a wearable device and a mobile app and geared towards a more cost-effective and efficient way of supervising asthmatic patients.

Although these teams didn’t end up winning at the World Finals, the experience they gained was invaluable. From travelling to Seattle, Washington to present their projects, to developing new skills and learning all about collaboration, they’re well equipped to take the next step in their technology careers.

And who knows – maybe the 2016 winner will hail from our continent. You could join the ranks of 2015 Imagine Cup World Champions, Team eFitFashion of Brazil. Their project, ‘Clothes For Me’, which is a marketplace for custom tailored clothes based on a person’s unique body size and shape, won them $50 000, a Microsoft Ventures Bootcamp and a private meeting with Microsoft CEO, Satya Nadella.

The 2016 season of Imagine Cup has kicked off. It’s never too early to start dreaming up your project in the Games, Innovation or World Citizenship categories – and show the rest of the world what African students are made of.

Visit www.imaginecup.com for more information.

Ruling by the EU Court of Justice on the transfer of personal data


Last week, the Court of Justice of the European Union ruled that the Safe Harbour agreement between the US and the EU on the transfer of personal data was invalid.

As a customer or partner, it is only natural to be concerned about what effect this might have on your solutions with Microsoft.

It is our firm conviction that all the necessary legal and technical contracts, agreements and processes are in place to continue processing your data and to secure any data transfers, including to the US – even now that Safe Harbour has been declared invalid as of 6 October 2015.

Microsoft has long ensured additional, strict protection of personally identifiable and sensitive personal data through, among other things, the EU Model Clauses, consistent compliance with a range of international standards for security and privacy, and of course operational procedures and technologies such as our ever-expanding use of strong encryption.

Read more about Microsoft Denmark's comments on the EU Court of Justice's ruling on the Safe Harbour principles here.

By Ole Kjeldsen, Security and Technology Director

DDI survey

Folks, we, the networking team at Microsoft, want to engage with you to make our products serve you better. To that end, we are conducting a survey to learn more about the DNS, DHCP and IP address analytics scenarios in your organization. It would be great if you could complete the following survey: https://microsoft.qualtrics.com/SE/?SID=SV_bf7xtW8v1O3aJlX...(read more)

A Snippet of Security


Hey Everyone, further to my post on Group Policy and AD last week I want to follow up with a post on Security.

When I’m out on site with customers doing risk assessments we always see the same risks being raised, mainly about the following topics:

  • Servers, especially DCs, not being patched. For example, we see MS14-068 missing on domain controllers in many customer environments. This patch (among others) is critical and was released in late 2014 – nearly 1 year ago. Please check your environment for missing patches.

  • Too many members in the highly privileged admin groups – this one always gets flagged, and comes down to too many people being permanent members of the Enterprise Admins and Domain Admins groups, or service accounts being members of these groups

  • High-privilege users having “password never expires” set

  • Stale user and computer accounts – it is very easy for attackers to compromise these accounts and stay unnoticed in an environment (see the PowerShell sketch below)

  • No configuration baseline or standard build for servers, especially DCs, therefore allowing unneeded, unwanted and potentially malicious software to run

  • Allowing internet access from servers, particularly DCs

  • Security baselines and basic security configuration not being done, such as restricting logon types for service accounts, restricting RDP access, configuring other User Rights Assignments, and configuring the advanced firewall

It may seem too hard to solve some of these issues; however, locking down these areas will significantly improve your overall security. It’s still a cat-and-mouse game, but you should be trying to make things as hard as possible for the attackers, and you really do need to make the investments, both financially and in the people needed to implement and maintain these controls.
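Two of the risks above – stale accounts and highly privileged users with non-expiring passwords – are easy to enumerate with the ActiveDirectory PowerShell module; a minimal sketch (the 90-day window is an arbitrary example):

Import-Module ActiveDirectory

# Users and computers that have not logged on for 90 days
Search-ADAccount -AccountInactive -TimeSpan 90.00:00:00 -UsersOnly
Search-ADAccount -AccountInactive -TimeSpan 90.00:00:00 -ComputersOnly

# Members of Domain Admins with "password never expires" set
Get-ADGroupMember 'Domain Admins' -Recursive |
    Where-Object objectClass -eq 'user' |
    Get-ADUser -Properties PasswordNeverExpires |
    Where-Object PasswordNeverExpires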

 

This post will have links to Microsoft’s current recommended practice when it comes to security, mainly focussed on the identity space. Again, this post may evolve over time so please use as a reference point and check back.

 

A Snippet of Security as it relates to identity:

 

Immutable laws of Security

The following two links provide the 10 immutable laws of security V1 and V2, some great reference material for your CIO:

https://technet.microsoft.com/library/cc722487.aspx

https://technet.microsoft.com/en-us/library/hh278941.aspx

 

Also, see this link for some general links around Windows Security:

https://technet.microsoft.com/en-us/library/windows-server-security.aspx

 

Active Directory

The bible when it comes to AD security is the “Best Practices for AD Security” whitepaper written by MSIT and Microsoft consultants. This document includes all of the best practices for proactively managing and securing your directory based on experience from several security CritSits, AD Security Assessments and breaches of Microsoft customers. You should be looking to align your environment as best you can with these recommendations:

http://www.microsoft.com/en-gb/download/details.aspx?id=38785

Another great document to reference is the Threats and Countermeasures guide:

https://technet.microsoft.com/en-us/library/hh125921(WS.10).aspx

 

Credential Theft

Credential theft became a big problem a few years ago when pass-the-hash attacks became prevalent. Although PtH has been around for many years, it is only in recent years that tooling has been freely available that makes these attacks trivial to undertake. Although the PtH “vulnerability” cannot be fixed with an update, there are plenty of mitigations to put in place, and these are documented in the excellent PtH whitepapers, which are a must-read for any security admin:

https://www.microsoft.com/en-gb/download/details.aspx?id=36036

It should be noted that mitigating these risks isn’t a panacea for security, as the credential thieves will just use other techniques, but it does help address the low-hanging fruit. There, I said it: low-hanging fruit :). Let’s see what other buzz terms I can drop in…

It’s also worth checking out the new Windows 10 feature called Credential Guard, which also helps mitigate credential theft – again not a panacea, but it certainly raises the bar. There’s another one :)

https://technet.microsoft.com/en-us/library/mt483740(v=vs.85).aspx

 

Accounts Security

Managing the security of accounts such as service accounts and admin accounts is paramount to the environment’s security, as these accounts are usually high-value targets for attackers. When using these accounts, the principle of least privilege should be applied. Some guidance does exist on this and can be found here for service accounts:

https://technet.microsoft.com/en-us/library/cc170953.aspx

And here for admin accounts:

https://technet.microsoft.com/en-us/library/cc162797.aspx

When looking at service accounts it is also worth seeing if you can invest in Group Managed Service Accounts, introduced with Windows Server 2012 (see the sketch after the link below):

https://technet.microsoft.com/en-GB/library/hh831782.aspx
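Getting started with a gMSA takes only a few cmdlets; a minimal sketch (the account, DNS and group names are hypothetical):

# One-time, per forest: create the KDS root key
# (allow up to 10 hours for replication in production)
Add-KdsRootKey -EffectiveImmediately

# Create the gMSA and allow a group of hosts to retrieve its password
New-ADServiceAccount -Name 'svcWeb01' -DNSHostName 'svcWeb01.contoso.com' `
    -PrincipalsAllowedToRetrieveManagedPassword 'WebServers'

# On each host in the WebServers group
Install-ADServiceAccount -Identity 'svcWeb01'
Test-ADServiceAccount -Identity 'svcWeb01'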

Lastly, check out the Best Practices for Delegating Active Directory Administration guide:

http://www.microsoft.com/en-gb/download/details.aspx?id=21678

 

PAM/JIT

Privileged Account Management and Just-in-Time elevation are quite big at the minute, and managing accounts with high levels of access in the environment (either admin or VIP accounts) is crucial to maintaining a good security posture. Therefore, Microsoft have implemented a PAM feature in Windows Server vNext and a JIT feature in MIM 2016. Check this post for more info:

https://technet.microsoft.com/en-us/library/dn903243.aspx

Also, our very own PoSH Chap Ian Farr has a good post on JIT using PowerShell and some of the built in features on Windows Server 2012:

http://blogs.technet.com/b/poshchap/archive/2015/05/29/latest-hsgb-outings-just-in-time-jit-administration.aspx

 

Proactive Monitoring and Detection

One of the key aspects of security is to perform monitoring, but also to have the correct alerts in place, tuned for your environment, so you can detect any attack or compromise. This guide will help you create this:

https://technet.microsoft.com/en-us/library/cc163158.aspx

 

Defining a Security Baseline:

You should have a security baseline in your environment for DCs, member servers and workstations such that you are blocking some legacy security protocols, blocking certain logon types for certain users, configuring the Windows firewall and services such as AppLocker. The Security Compliance Manager tool can help you with this:

https://technet.microsoft.com/en-gb/solutionaccelerators/cc835245.aspx

 

PKI

As you know from reading my blog, PKI is my passion, and the security around PKI, including designing key-signing ceremonies and secure implementations, is what I specialised in before joining Microsoft. I’ve touched on some of the security aspects of PKI in my non-repudiation and offline CA virtualisation posts here:

http://blogs.technet.com/b/paranoidhumanoid/archive/2015/10/04/non-repudiation.aspx

http://blogs.technet.com/b/paranoidhumanoid/archive/2015/09/25/should-i-virtualise-my-root-ca.aspx

Further to my postings there is also the Securing PKI whitepaper which is very good:

https://technet.microsoft.com/en-us/library/dn786443.aspx

 

There are plenty of other reference guides and links that you can use, however the above will give you a starter for ten and plenty of bedtime reading!
