Channel: TechNet Blogs

International Update: A great meeting in the Brazilian Community


Last week, some Brazilian TechNet Wiki authors gathered to celebrate all the work carried out over the past months and to discuss new ways to share knowledge about different Microsoft technologies.

I have to admit, I'm very happy about the growth opportunities we have in the coming months, because we have some good ideas to implement around Azure, Windows 10 and other technologies.

 

 

In the picture, we can see some of these authors (from right to left):

The new Alan Carlos (after losing 154 lbs. Amazing!!!), Luciano Lima, Durval Ramos (aka me!!! =^)) and Vinicius Mozart.

See below some of the articles written by these authors (in Portuguese):

 Alan Carlos

- ALM - Test Manager - Criando um Plano de Testes (End-to-End)

- ALM - Testes - Visual Round Trip Analyzer

- ALM e Operações de TI - Gestão 360 com o System Center Operations Manager em 06 Passos


Luciano Lima

Reduzindo a Exploração de Vulnerabilidades com o EMET 5.2

Explorando a Console do WSUS Server 4.0

Como Manter Seu Computador Atualizado e Seguro


Durval Ramos

- Caçando Registros Fantasma no SQL Server

- Paginando uma Consulta com SQL Server

- SSIS - Manipulando Eventos com "OnError" ou "OnTaskFailed"


Vinicius Mozart

- Inventário de Rede com MAP (Microsoft Assessment and Planning Toolkit)

- Detectando Erros (The WMI Diagnostics Utility)

- Serviço de Gerenciamento no Microsoft Azure (E-mail Alerts)

 

It's great to know a little more about the work of the Brazilian community.

I hope this meeting provides an important opportunity for many other members to share knowledge around the world through our TechNet Wiki.

 

See you here soon!

Brazilian Wiki Ninja Durval


SharePoint 2013 Distributed Cache recommendations


A short list of recommendations and remarks on AppFabric for SharePoint 2013:

  • It might sound strange, but don't disable/remove the Distributed Cache cluster on your farm even if, for instance, it performs better without it; solve the underlying issue instead.
  • Before disconnecting a server from the farm, make sure that it is not part of the AppFabric cluster, as removing it will break the cluster.
  • Install the latest AppFabric CU and make sure the Garbage Collection section is added to the config file.

Latest at time of writing:
https://support.microsoft.com/en-us/kb/3042099
GC:
http://blogs.msdn.com/b/calvarro/archive/2014/03/20/points-to-consider-with-distributed-cache-on-sharepoint-2013.aspx
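For reference, the Garbage Collection change described in the KB and blog above is a small addition to DistributedCacheService.exe.config on each cache host. The snippet below is a sketch of that documented setting; verify the exact key against the KB for your CU level before applying it:

```xml
<!-- Sketch of the Garbage Collection setting for DistributedCacheService.exe.config,
     available after installing a recent AppFabric CU. It enables background
     (non-blocking) garbage collection for the cache service. -->
<appSettings>
  <add key="backgroundGC" value="true" />
</appSettings>
```

Restart the Distributed Cache service on the host after editing the file so the setting takes effect.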

  • There are articles out there that tell you it is OK to perform unsupported procedures. It isn't:

If you want to use AppFabric for custom caches, don't use the SharePoint Cluster.
Do not run AppFabric Tools for managing your AppFabric Cluster or Hosts, only use Distributed Cache PS commands.
The AppFabric "Windows" service should not be touched.
As with almost everything on DB level in SharePoint, don't touch the Cluster Details in the config DB.
The AppFabric Export-CacheClusterConfig functionality should not be used to manipulate the cluster or hosts unless verified by MS Support; there are very few reasons to do so anyway, as most tasks can be done in other ways.

  • Change the values for the caches from day 1 as recommended; increase them even further if need be.

https://technet.microsoft.com/en-us/library/jj219613.aspx#finetune

  • Run AppFabric on WFEs (unless they are struggling to keep up), not on application servers; if possible, use dedicated cache hosts.
  • Don't run AppFabric on servers that host services that will compete for hardware resources.

This can contradict the previous statement, but if your WFEs are overloaded and you have a limited number of application servers, I would seriously consider adding dedicated servers.
https://technet.microsoft.com/en-us/library/jj219572.aspx

  • Increase memory allocation sufficiently, based on the Microsoft recommendation, or even further if you see that the caches are overloaded.

https://technet.microsoft.com/en-us/library/jj219613.aspx#memory

  • Don't go overboard with the number of cache hosts: having 10 WFEs does not mean you need 10 cache hosts; 2-3 will be enough in most cases.
  • Use the script on TechNet for a graceful shutdown of a host, and don't forget to run Use-CacheCluster first. ;)

https://technet.microsoft.com/en-us/library/jj219613.aspx#graceful
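As a rough sketch (the TechNet article linked above remains the authoritative script), the graceful shutdown boils down to these SharePoint Management Shell cmdlets, run on the cache host you want to take offline:

```powershell
# Graceful Distributed Cache shutdown - run in the SharePoint Management Shell
# on the cache host being taken down for maintenance.
Use-CacheCluster                                   # connect to the cluster configuration first
Stop-SPDistributedCacheServiceInstance -Graceful   # drain cached items to the remaining hosts
Remove-SPDistributedCacheServiceInstance           # then remove the instance from this server
```

When maintenance is done, Add-SPDistributedCacheServiceInstance brings the host back into the cluster.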

  • After doing maintenance on cache hosts, check that all of them are healthy and that their memory size and service account match.

Use-CacheCluster
Get-CacheHost
Export-CacheClusterConfig -Path C:\export.txt

  • When virtualizing your SharePoint servers, make sure you are not using Dynamic Memory.
  • When you run into an issue and the cluster is in an unhealthy state from which you simply can't recover (due to unsupported actions, for instance), rebuild your farm instead of trying to fix it manually.

https://technet.microsoft.com/en-us/library/jj219613.aspx
The Distributed Cache service can end up in a nonfunctioning or unrecoverable state if you do not follow the procedures that are listed in this article.
In extreme scenarios, you might have to rebuild the server farm.
The Distributed Cache depends on Windows Server AppFabric as a prerequisite.
Do not administer the AppFabric Caching Service from the Services window in Administrative Tools in Control Panel.
Do not use the applications in the folder named AppFabric for Windows Server on the Start menu.

  • Solving an issue with AppFabric usually comes down to going over your list of recommendations, known issues and misconfigurations; if the issue persists after that, well ... Houston.
  • I'd highly recommend making sure your cluster is configured according to best practices before you start troubleshooting; it can save you a lot of pain and effort.

 

Related articles:

http://blogs.technet.com/b/filipbosmans/archive/2015/01/04/troubleshooting-distributed-cache-for-sharepoint-2013-on-premise.aspx

http://blogs.technet.com/b/filipbosmans/archive/2015/09/07/how-to-check-for-issues-with-distributed-cache-and-the-script.aspx

Use PowerShell to Parse Network Trace Logs—The Video


Summary: Ed Wilson, Microsoft Scripting Guy, presents a video to show how to use Windows PowerShell to parse network trace logs.

Microsoft Scripting Guy, Ed Wilson, is here. Today I am presenting a video where I show how to use Windows PowerShell to parse network traces. The steps I show are common commands that will normally be run to set up and collect network tracing. I will talk about the following:

  • Importing the log
  • Parsing the log

   Note  For more information about this technique, see Use PowerShell to Parse Network Trace Logs. Also check out yesterday's video, Use PowerShell to Collect Network Traces.

Here is the video:   

(Please visit the site to view this video)

Here is a link to the video from YouTube if you would like to download it or play it offline in a different video player:  Use PowerShell to parse a network trace by Ed Wilson.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy 

#findingproductivity: Productivity in the 21st century does not mean "working more" but "knowing and sharing more"


At the NEXT15 digital conference, held as part of the Reeperbahn Festival in Hamburg on September 24 and 25, 2015, we tackled concrete questions of digital transformation and asked: "How will we live, dwell and work in the future?"

Productivity plays a major role in the discussion about the future of work. It is a concept that has changed significantly with digitization and needs to be redefined. In the 19th and 20th centuries, productivity meant increasing output through better input; in other words, it was about squeezing more performance out of people for better results. No wonder the concept held little appeal.

Today things are different: productivity depends, alongside the automation of production, above all on how we use and share our knowledge. At #NEXT15 I spoke, among other things, about this new definition of productivity:

We started the dialogue about productivity at #NEXT15 and are continuing it. More than 400 knowledge workers recently told us what productivity means to them, and our dossier with 16 theses provides additional material to carry the #findingproductivity discourse, and a new formula for productivity, forward.

 

 

 

A post by Dr. Thorsten Hübschen (@ThorHuebschen),
responsible for the Office business at Microsoft Germany.

PowerTip: Use PowerShell to Display Number of Minutes in a Day


Summary: Use Windows PowerShell to display the number of minutes in a day.

Hey, Scripting Guy! Question How can I use Windows PowerShell to easily display how many minutes are in a day?

Hey, Scripting Guy! Answer Create a timespan equal to one day, and then select the total minutes from it:

(New-TimeSpan -Days 1).totalminutes
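The same pattern works for any unit conversion a TimeSpan can express; here are a few extra variations (standard PowerShell):

```powershell
# Variations on the same New-TimeSpan pattern:
(New-TimeSpan -Days 1).TotalSeconds    # 86400 - seconds in a day
(New-TimeSpan -Hours 1).TotalMinutes   # 60    - minutes in an hour
(New-TimeSpan -Days 7).TotalHours      # 168   - hours in a week
```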

Developing Web Applications using ASP.NET 5 (beta 7) on Ubuntu Linux 14.04.2 LTS Part 1 – Installing & Configuring

By: Nestor Guadarrama With the new version of ASP.NET 5, web developers need to change their web development paradigms. This version was totally redesigned to make ASP.NET more open. Basically, this means that ASP.NET 5 is now considered a new open-source and cross-platform framework for building modern cloud-based web applications using .NET. Microsoft built it from the ground up to provide an optimized development framework for apps that are either deployed to the...(read more)

4 Ways Owning the Entire Customer Lifecycle Makes You More Profitable [Updated October 18]


(This article is a translation of 4 ways owning the entire customer lifecycle makes you more profitable, published on the Microsoft Partner Network Blog on September 2, 2015. For the latest information, please refer to the original page.)

"Moving to the cloud makes you more profitable." You may have heard this many times, yet many partners still hesitate even with a major opportunity right in front of them. Let's revisit this statement. According to a 2014 study* that Microsoft commissioned from IDC, the most successful Microsoft cloud partners achieved 1.4 times higher revenue and, notably, 1.5 times higher gross profit. Beyond higher revenue, their profit has also become more sustainable over the long term: cloud partners are growing gross profit through recurring revenue from managed services and packaged IP, while profit from products and project services is declining. All of these points help explain why cloud-based companies are valued so highly. With nearly every metric pointing to the cloud as the best way to grow a business, we can't help but want to support partners in getting started with it.

 

The Cloud Solution Provider program

The Cloud Solution Provider (CSP) program is a two-tier system through which partners can build a profitable cloud business by selling Microsoft cloud services together with their own services. Becoming a CSP partner means owning the entire customer lifecycle, which makes you more competitive and grows your revenue.

 

  1. Drive sales of value-added services: Previously, if a partner sold services through the advisor model, the customer purchased online services from the partner but dealt with Microsoft directly for subsequent billing and support. The customer had to manage two vendors, and the partner owned only part of the purchase process.

    With CSP, the partner handles the end customer directly, carrying out all the key tasks such as billing management and support. Partners can therefore deliver services in a consistent, end-to-end way while layering on value-added services such as managed services and packaged IP. Customers benefit from a simplified process, since payments and other interactions go through a single vendor, and partners benefit from a stronger presence as the owner of the customer relationship. Partners can also expect cross-sell and up-sell opportunities for their own high-margin, high-value services.
  2. Strengthen multi-service sales: The CSP program covers all of Microsoft's online services (Office 365, Enterprise Mobility Suite, Microsoft Azure and Dynamics CRM Online). CSP partners can therefore sell more solutions and services in a comprehensive, planned way, which makes it easier to grow revenue from existing customers and improves profitability.
  3. Elevate your standing as a trusted advisor: A CSP partner is the customer's single point of contact for products, services and support. Partners can thus establish themselves as trusted advisors to end customers and raise their profile, which makes cross-selling and up-selling easier and reduces the risk of churn. Moreover, when a partner works earnestly for the customer as a trusted advisor, the customer is far less likely to entertain offers from competitors.
  4. Achieve major efficiency gains through automation: Microsoft has created a set of APIs and tools that let CSP partners integrate their own billing and ERP systems directly with Microsoft's systems. Driving automation in this way makes operations more efficient and cuts costs.

 

Microsoft has already launched the CSP program and is helping partners working in the cloud grow their revenue. You can choose whichever of the two models described on the MPN site best suits your business. We hope you will take a look.

 

Brent Combest
@BrentCombest

* "Successful Cloud Partners 2.0," IDC eBook, 2014

Sunday Surprise - Do you know how collaboration is distributed in our community?

Hello, friends of the Wiki Ninja community.

Welcome to our Sunday Surprise.

Our goal is to show how contributions are distributed across our Brazilian community.



If you don't know TechNet Wiki yet, visit our official page at: http://social.technet.microsoft.com/wiki/pt-br/default.aspx

We know that many members contribute through several channels, such as forums, translations, galleries and blogs, among others.

However, we want to focus on two important topics within this universe: the publication of technology articles and the top contributors to TechNet Wiki.

As everyone knows, at least once a month the international contribution ranking is published on the English-language blog.

According to the latest status, published in Friday with International Community Update – Progress in each language (Sept. 2015),

we are in second place overall, with 4,310 articles written. A great number, by the way.

To reach that number, however, we count on the effective participation of our members, who are spread across the various states of Brazil.

Which raises the question: which Brazilian states contribute the most to our community?

To answer it, we carried out a study and arrived at the following result.

Of the 27 states in our country, 11 are the ones that contribute the most.

Here is the ranking and the number of contributions per state.



Comparison chart of the states.



Another finding from our analysis: of the 115 contributors analyzed, 22 have passed the mark of 40 articles published on the Wiki.

Below you will find the list of the 22 biggest contributors to our community so far.



A chart comparing them:



For more details about these members' contributions, visit their personal pages in the community.

Fernando Lugão Veltem

Luciano Lima [MVP] Brazil

Marcelo Strippoli

Uilson Souza - MCTS MTAC

Luiz Henrique Lima Campos [MVP]

Jordano Mazzoni - MVP

Durval Ramos

Hezequias Vasconcelos

Alan Nascimento Carlos

Vinicius Mozart

Rafael Mantovani

Erick Albuquerque

Daniel Donda

Ozimar Henrique

Caio Vilas Boas

Thiago Cardoso Luiz

Marcelo Sincic - MVP

Leandro E. Carvalho

Jefferson Castilho

Ana Paula de Almeida

Luan.Moreno A.k.a SQL.Soul

Thiago Guirotto


With this information, it is easy to see that the Brazilian community's second place overall in the international ranking is no mere coincidence.

We count on the commitment, dedication and support of many community members to reach these numbers and keep our prominent position among other communities.

We deeply thank all members for their help, and we hope to count on even more contributions.


Thank you for the opportunity.

Wiki Ninja Hezequias Vasconcelos ++


Remote mailbox move fails with "You must specify the PrimaryOnly parameter"


You have a situation where you are performing a remote mailbox move from your on-premises Exchange server to Office 365 Exchange Online, and you come across this error: "You must specify the PrimaryOnly parameter".

This situation occurs for the following reasons:

1. In your hybrid setup, the user's primary mailbox is on the on-premises Exchange server and the archive is in the cloud.

2. You then decide to move the user's primary mailbox to the cloud so that both the primary mailbox and the In-Place Archive are in the cloud.

3. You initiate a remote mailbox move from the Office 365 EAC and choose this user for the move; you now get the error 'You must specify the PrimaryOnly parameter'.

This is because the GUI has no option to select only the primary mailbox, even though only the archive mailbox is already in the cloud. See below:

How do we get around this situation? It's easy: we will use PowerShell!

Connect to Exchange Online using the cmdlets below. Run them one after the other in Windows PowerShell.

$UserCredential = Get-Credential (this will prompt for a user name and password; use your Office 365 global admin credentials)

$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection

Import-PSSession $Session

Now that we are connected to Office 365 Exchange Online using PowerShell, we are all set to run the cmdlet that will help resolve the issue for us.

New-MoveRequest -Identity <user email address to move> -Remote -RemoteHostName 'your on-premises mrxproxy url' -RemoteCredential (Get-Credential) -BatchName <name of the batch> -PrimaryOnly -TargetDeliveryDomain <Office 365 mail.onmicrosoft.com domain>

Note: This will ask for a credential; use your on-premises Exchange admin credential.

And we are done. The move request should be created, and you can check its status using the cmdlet below:

Get-MoveRequestStatistics -Identity <user email address>
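If you want to watch the whole batch rather than one mailbox, here is a hedged sketch (the batch name and mailbox address below are placeholders, not values from this scenario):

```powershell
# List every move request created under the batch, with its current status:
Get-MoveRequest -BatchName "PrimaryOnlyBatch" | Format-Table DisplayName, Status

# Drill into a single mailbox for progress details:
Get-MoveRequestStatistics -Identity "user@contoso.com" |
    Select-Object DisplayName, PercentComplete, StatusDetail
```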

Thanks and Happy Learning!

Dude It’s a String—The Video


Summary: Ed Wilson, Microsoft Scripting Guy, presents a video to talk about working with strings in Windows PowerShell.

Microsoft Scripting Guy, Ed Wilson, is here. Today I present a video where I talk about working with strings—in particular empty or null strings. I talk about the differences between empty, null, and white space in strings and present two static string methods that are always available to deal with the problem.

   Note  For more information on these techniques, see Dude, a String Is a String in PowerShell.

(Please visit the site to view this video)

Here is a link to the video from YouTube if you would like to download it or play it offline in a different video player: Is the string null or empty? by Ed Wilson.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy 

Updated ModernBiz Technical Training Events For SMB Partners Coming Soon


Over the coming months there will be a number of events being delivered as part of the Modern Business campaign which cover a range of Microsoft technologies including Office 365, Azure, Windows Server 2012 R2, Hyper-V and Windows 10. Based on the ModernBiz events that have run previously, these have been updated to include some of the latest and most relevant content for SMB partners looking to deliver a combination of cloud, on-premises and hybrid solutions to their customers. Some of the content is still being finalised at the moment, but here's what we can currently share to give you an idea of what's going to be included. As this content isn't complete yet there may be some changes, but we will post updates to this as the events get closer.

 Grow Efficiently - Coming January 2016

Module 1: Deploying and Updating Windows 10 

In this session you will learn the technical ins and outs of deploying Windows 10, including the in-place upgrade process, configuring new PCs and tablets, and image creation and deployment across small and midsize business organizations.

Module 2: Getting Started with an Office 365 Implementation

This module teaches partners about the technical considerations around planning an Office 365 implementation, steps through creation of an Office 365 tenant, and addresses licensing choices. 

Module 3: Understanding Network Connectivity to Optimize Performance

This module is designed to help partners troubleshoot Office 365 network performance, configure the Office 365 client and network proxy for optimal performance, and monitor and troubleshoot Office 365 performance.

Module 4: Office 365 Tools for Setup and Email Migration

This module teaches partners about the Office 365 tools for new setups and email migrations, including the planning, batching and moving of mailboxes. 

Module 5: Office 365 Service Management

This module is designed to teach partners how to manage change in Office 365, as well as how to think about managing service health, incidents, and planned maintenance. 

Appendix: Securing Windows 10  

Learn about the very latest advancements in Security on Windows 10, from Windows Hello and Passport to Device Guard, Cloud Identity & Information Protection, Azure Active Directory and Enterprise Data Protection.

Limited number of seats, don't miss out REGISTER NOW 

 

 

Business Anywhere - Coming February 2016

Module 1: Windows 10 Management (with IE 11 and Edge) 

This session dives into device management in Windows 10, with a particular focus in the technical aspects of Windows as a Service and Mobile Device Management.  

Module 2: Mobile Device and Identity Management with Intune, EMS, and Office 365

This module covers the challenges and requirements that SMBs face when implementing a mobile device management (MDM) and bring-your-own-device (BYOD) infrastructure, the setup and usage of Intune, the setup and usage of the Azure Enterprise Mobility Suite (EMS), and the MDM capabilities of Office 365.

Module 3: Remote Desktop Service and Azure Remote App

This module teaches students how to design and implement a Windows Server 2012 R2 Remote Desktop Services (RDS) and Virtual Desktop Infrastructure (VDI) environment using on-premises servers, and also how to set up and configure Azure RemoteApp for cloud-based RDP scenarios.

Module 4: Deploying Office 365 ProPlus

This module teaches partners how to deploy Office 365 ProPlus across devices and platforms, including the use of packaging, scripting and automation, and how to diagnose and troubleshoot these deployments. 

Module 5: Skype for Business Conferencing

This module is designed to teach partners to deploy and configure Skype for Business conferencing and dig into how to manage conferencing capabilities, including meetings and groups. 

Limited number of seats, don't miss out REGISTER NOW

 

Safeguard Your Business - In your city this December

Module 1: Azure Backup and ASR 

This module teaches students about storage, backup and disaster recovery. Students learn how to use Azure Backup to protect on-premises and Azure virtual machines. They also see how to set up Azure Site Recovery (ASR) and configure disaster recovery for various types of workloads. This module also covers the new capabilities of the Azure Backup on-premises server (Project Venus).

Module 2: Securing Windows 10 

Learn about the very latest advancements in Security on Windows 10, from Windows Hello and Passport to Device Guard, Cloud Identity & Information Protection, Azure Active Directory and Enterprise Data Protection.  

Module 3: Data Loss Prevention in Office 365

This module teaches partners how to identify, monitor and protect sensitive information in their organizations with Data Loss Prevention in Office 365, including policies, setup and filtering. Partners will be introduced to both the admin and user experiences, and learn about Microsoft's approach to Data Loss Prevention across Windows, Office, and Azure.

Module 4: eDiscovery and Archiving in Office 365

This module is designed to teach partners about the principal uses of eDiscovery in Office 365, including discoverability, retention policies and legal holds, and the capabilities and experiences across Exchange, Outlook, SharePoint and OneDrive. 

Module 5: Office 365 and Azure AD Premium RMS

This module is designed to teach partners about the principles of Rights Management Services, how to optimize and accelerate a deployment of these services, and how to use them to protect sensitive data.

Appendix: Improving Availability and Recoverability with SQL Server 2014 

This appendix covers topics related to improving the availability and recoverability of an on-premises SQL Server 2014 infrastructure. Additionally, it covers implementing various high availability and disaster recovery solutions that span on-premises and cloud-based infrastructure with SQL Server 2014.

Limited number of seats, don't miss out REGISTER NOW

 

Connect With Customers (In town this November)

Module 1: Introduction to CRM Online

This module will focus on CRM Online's architecture and high availability. Partners will also learn where CRM Online's datacenters are located, as well as the compliance offerings available in those datacenters, which will help partners design solutions that match customers of any size.

Module 2: CRM Online Integration with Office 365

In this module, we will focus on how Microsoft Dynamics CRM Online leverages Office 365 and Azure technologies to create a more robust and complete solution for organizations around the world. 

Module 3: CRM Online Integration with Office 365

In this module, we will continue to focus on how Microsoft Dynamics CRM Online leverages Office 365 and Azure technologies to create a more robust and complete solution for organizations around the world

Module 4: CRM Online Integration with Power BI

In this module, partners will learn how to leverage Power BI to complement the out-of-the-box Microsoft Dynamics CRM Online reporting options, such as Reports, Charts, Lists, and Dashboards.

Module 5: CRM Online Licensing

This module will focus on understanding the highly customizable and flexible licensing details for Microsoft Dynamics CRM Online, Microsoft Dynamics Marketing, Microsoft Parature, Unified Service Desk, and Microsoft Social Engagement.

Limited number of seats, don't miss out REGISTER NOW

Top Contributors Awards! Love it, Like it, link it!


Welcome back for another analysis of contributions to TechNet Wiki over the last week.

First up, the weekly leader board snapshot...

 

Now, let's break that down a bit more. As always, here are the results of another weekly crawl over the updated articles feed.

 

Ninja Award: Most Revisions Award  
Who has made the most individual revisions
 

 

#1 Peter Geelen - MSFT with 48 revisions.

  

#2 Ed Price - MSFT with 28 revisions.

  

#3 Yokels RECLAIMED with 24 revisions.

  

Just behind the winners but also worth a mention are:

 

#4 Deva [MSFT] with 21 revisions.

  

#5 Ken Cenerelli with 15 revisions.

  

#6 Arleta Wanat with 13 revisions.

  

#7 Prem Rana with 10 revisions.

  

#8 Mauricio Feijo with 9 revisions.

  

#9 Yan Grenier - MTFC with 9 revisions.

  

#10 Ruud Borst with 9 revisions.

  

 

Ninja Award: Most Articles Updated Award  
Who has updated the most articles
 

 

#1 Peter Geelen - MSFT with 28 articles.

  

#2 Ken Cenerelli with 12 articles.

  

#3 Prem Rana with 10 articles.

  

Just behind the winners but also worth a mention are:

 

#4 Yokels RECLAIMED with 9 articles.

  

#5 Yan Grenier - MTFC with 7 articles.

  

#6 Emiliano Musso with 4 articles.

  

#7 Saeid Hasani with 4 articles.

  

#8 Durval Ramos with 4 articles.

  

#9 Hezequias Vasconcelos with 3 articles.

  

#10 Mauricio Feijo with 3 articles.

  

 

Ninja Award: Most Updated Article Award  
Largest amount of updated content in a single article
 

 

The article to have the most change this week was Office 365 Knowledge Base Library, by MS2065 [MSFT]

This week's reviser was MS2065 [MSFT]

Just to show how busy these guys are and how relevant TNW is, here is the world renowned MS2065. Rumors are that this is actually Bill Gates. He is so obsessive, he watches over us all, gently tweaking and buffing behind the scenes.

 

Ninja Award: Longest Article Award  
Biggest article updated this week
 

 

This week's largest document to get some attention is Arquivo Antigo - Agenda de Publicações no Blog Wiki Ninjas Brasil, by Durval Ramos

This week's reviser was Durval Ramos

Behold, the Brazilian contingent of the TechNet Team. Showing us how TNW transcends borders and brains :)

.  

Ninja Award: Most Revised Article Award  
Article with the most revisions in a week
 

 

This week's most fiddled with article is Microsoft Small Basic 1.2 Release Notes, by Ed Price - MSFT. It was revised 20 times last week.

This week's reviser was Ed Price - MSFT

Ed has become a long standing and active supporter of SB, and here he continues the tradition. Small language, big following!

 

Ninja Award: Most Popular Article Award  
Collaboration is the name of the game!
 

 

The article to be updated by the most people this week is Wiki Ninjas Blog Authoring Schedule, by Ed Price - MSFT

And here we have the English speaking bloggers, pouring in to grab a slot to wax lyrical about TNW. We love you all.

This week's revisers were Andy ONeill, Recep YUKSEL, Sandro Pereira, Yagmoth555, Emiliano Musso, Margriet Bruggeman, Luiz Henrique Lima Campos [MVP], Davut EREN, Alan Nascimento Carlos, Hezequias Vasconcelos & Durval Ramos

 

Being a regular winner, the second article to be updated by the most people this week is TechNet Guru Contributions - October 2015, by XAML guy

Lots of awesome content flooding in for Shocktober! Go check it out and help buff it up a little before judging!

This week's revisers were Mauricio Feijo, Idan Vexler, Mang Alex, Janardhan Bikka, Yashwant Vishwakarma, Ed Price - MSFT, .paul. _ & SYEDSHANU

 

 

Ninja Award: Ninja Edit Award  
A ninja needs lightning fast reactions!
 

 

Below is a list of this week's fastest ninja edits. That's an edit to an article made shortly after another person's edit.

 

Ninja Award: Winner Summary  
Let's celebrate our winners!
 

 

Below are a few statistics on this week's award winners.

Most Revisions Award Winner
The reviser is the winner of this category.

Peter Geelen - MSFT

Peter Geelen - MSFT has been interviewed on TechNet Wiki!

Peter Geelen - MSFT has featured articles on TechNet Wiki!

Peter Geelen - MSFT has won 63 previous Top Contributor Awards. Most recent five shown below:

Peter Geelen - MSFT has TechNet Guru medals, for the following articles:

Peter Geelen - MSFT's profile page



Most Articles Award Winner
The reviser is the winner of this category.

Peter Geelen - MSFT

Peter Geelen - MSFT is mentioned above.



Most Updated Article Award Winner
The author is the winner, as it is their article that has had the changes.

MS2065 [MSFT]

This is the first Top Contributors award for MS2065 [MSFT] on TechNet Wiki! Congratulations MS2065 [MSFT]!

MS2065 [MSFT] has not yet had any interviews, featured articles or TechNet Guru medals (see below)

MS2065 [MSFT]'s profile page



Longest Article Award Winner
The author is the winner, as it is their article that is so long!

Durval Ramos

Durval Ramos has been interviewed on TechNet Wiki!

Durval Ramos has featured articles on TechNet Wiki!

Durval Ramos has won 19 previous Top Contributor Awards. Most recent five shown below:

Durval Ramos has TechNet Guru medals, for the following articles:

Durval Ramos's profile page



Most Revised Article Winner
The author is the winner, as it is their article that has been changed the most.

Ed Price - MSFT

Ed Price - MSFT has been interviewed on TechNet Wiki!

Ed Price - MSFT has featured articles on TechNet Wiki!

Ed Price - MSFT has won 124 previous Top Contributor Awards. Most recent five shown below:

Ed Price - MSFT has TechNet Guru medals, for the following articles:

Ed Price - MSFT's profile page



Most Popular Article Winner
The author is the winner, as it is their article that has had the most attention.

Ed Price - MSFT

Ed Price - MSFT is mentioned above.

XAML guy

XAML guy has been interviewed on TechNet Wiki!

XAML guy has featured articles on TechNet Wiki!

XAML guy has won 91 previous Top Contributor Awards. Most recent five shown below:

XAML guy has TechNet Guru medals, for the following articles:

XAML guy's profile page



Ninja Edit Award Winner
The author is the reviser, for it is their hand that is quickest!

.paul. _

.paul. _ has been interviewed on TechNet Wiki!

.paul. _ has featured articles on TechNet Wiki!

.paul. _ has won 6 previous Top Contributor Awards. Most recent five shown below:

.paul. _ has TechNet Guru medals, for the following articles:

.paul. _'s profile page



So much to love here, not enough time to read it all. I hope you find something you like.

 

Best regards,
Pete Laker (XAML guy)

 

Transform ANY nested table to Pivot Table with function query


Prepare to be amazed :)

This is the fourth post in the series The Definitive Guide to Unpivot with Power Query in Excel.

In this series we walk you through one of the coolest data transformation features in Excel - The Unpivot transformation. Powered by Power Query technology, the Unpivot transformation is available for everyone using the new Get & Transform section of the Data tab in Excel 2016, or as an Add-in in previous versions of Excel.

In my last post here, I showed you how to transform a nested table into a PivotTable. In today's post we will move one step further, and learn how to transform ANY nested table into a PivotTable. Why ANY? Because no matter how many fields you have as nested rows and columns, you will be able to transform it to a PivotTable if you follow today's walkthrough.

We will start with a specific scenario, and then share with you a more generic function query which uses Power Query expression (M) to address any type of nested table.

Let's start with a table that uses Continent, Country and City as row fields, and School Type, Gender and Class as column fields:

Our goal is to unpivot and transform the 3x3 nested table above into the following table, which can then be used by PivotTables and PivotCharts:

 

Let's start

Download the workbook from here, open it with Excel 2016 (You can also perform the steps in Excel 2010 and Excel 2013 after you install Power Query Add-In).

Select any cell in the range, and click From Table in the Get & Transform section of the Data tab (or in Power Query tab if you use the Add-In in older versions of Excel).

Note: If you use this tutorial on your own table, the use of From Table will convert the range into a Table. If you don't want to turn the range into a table, you can assign a name to the range before you click From Table. To assign a name to a range, select the range and type a name into the highlighted box:

In this step we will fill the missing values for the continent and country columns.

In the Query Editor, select the first two columns, right click on one of the headers, click Fill, and then click Down.

The next step is to fill the null values of the first two rows with the School Type and Gender values. Since we don't have a Fill Right command, we will transpose the table and then apply the Fill Down transformation on the School Type and Gender headers.

In the Query Editor, go to the Transform tab and click Transpose.

Select the first two columns that represent the School Type and Gender, right click on one of the headers, click Fill, and then click Down.
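The missing Fill Right command can be expressed as Transpose → Fill Down → Transpose, which is exactly what we just did. A small Python sketch of the idea, using made-up School Type and Gender values:

```python
def fill_down(column):
    """Replace None values with the last non-None value above them."""
    last, out = None, []
    for v in column:
        if v is not None:
            last = v
        out.append(last)
    return out

def transpose(table):
    """Rows become columns and vice versa, like Transform > Transpose."""
    return [list(col) for col in zip(*table)]

table = [
    ["Elementary", None, "High", None],   # School Type row with gaps
    ["Boys", "Girls", "Boys", "Girls"],   # Gender row, already complete
]
# Fill Right on row 0 == transpose, Fill Down the first column, transpose back
transposed = transpose(table)
first_col_filled = fill_down([r[0] for r in transposed])
for r, v in zip(transposed, first_col_filled):
    r[0] = v
result = transpose(transposed)
print(result[0])
# → ['Elementary', 'Elementary', 'High', 'High']
```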

 

In the next step we will merge the first three columns that represent School Type, Gender and Class, so we will later be able to use the School Type/Gender/Class combinations as headers.

Let's assume that we can use the semicolon character as a separator in the merge operation.

Note: If your table contains semicolon characters in the header names, you should select a character other than the semicolon as the separator.

Select the first three columns, right click on one of the headers, and click Merge Columns.

In Merge Columns dialog, select Semicolon as the Separator, and click OK.

Now let's transpose the table back, and use our merged School Type;Gender;Class values as headers.

In the Transform tab, click Transpose, and then click Use First Row As Headers.

Let's rename the first three columns to Continent, Country and City.

(To rename a column header simply double click on its label, and type the new name, then press Enter or click outside of the header).

Now we are ready for the Unpivot magic:

Select the columns Continent, Country and City, right click on one of the headers and click Unpivot Other Columns.

Rename the highlighted column headers to School Type, Gender and Class. You can also rename the Value column to Grade.
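To see what the Unpivot Other Columns step does under the hood, here is a rough Python sketch. It keeps the key columns fixed and turns every other column into (Attribute, Value) rows; the column names follow this walkthrough, and the grade values are made up for illustration. Note that Power Query also drops null values during unpivot:

```python
def unpivot_other_columns(rows, header, key_count):
    """Keep the first key_count columns; emit one row per remaining cell."""
    out = []
    for row in rows:
        keys = row[:key_count]
        for name, value in zip(header[key_count:], row[key_count:]):
            if value is not None:  # Power Query drops nulls here
                out.append(keys + [name, value])
    return out

header = ["Continent", "Country", "City",
          "Elementary;Boys;A", "Elementary;Boys;B"]
rows = [["Europe", "France", "Paris", 81, 94]]

result = unpivot_other_columns(rows, header, 3)
for r in result:
    print(r)
# → ['Europe', 'France', 'Paris', 'Elementary;Boys;A', 81]
# → ['Europe', 'France', 'Paris', 'Elementary;Boys;B', 94]
```

The merged attribute column is then split on the semicolon to recover School Type, Gender and Class.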

Finally, in Home tab, click Close & Load.

That's it. We got the desired format as a new table in the workbook. Now we can create a PivotTable and re-pivot the data as we see fit.

In the next section we will discuss how to apply the same kind of transformation to any kind of nested table.

The Generic Function Query

Now let's move to the generic solution. As we stated at the beginning of this tutorial, we want to perform this unpivot transformation on nested tables with ANY number of nested row and column headers.

Based on the transformation we did previously, I created a function query, FnUnpivotNestedTable, that performs the same sequence of transformations on any kind of nested table. The function query can be found in our workbook.

To see an example of a query that uses this function, open the workbook here, click Show Queries, and double click on the query How To Use The Function.

  

 

If you have a workbook with a nested table, and would like to apply the function above on your table, perform the following steps:

To copy and paste the function query:

  1. Open this workbook.
  2. In Data tab, click Show Queries.
  3. Double click on the function query FnUnpivotNestedTable.
  4. In the Query Editor, click Advanced Editor.
  5. Copy the entire code from the Advanced Editor dialog, click Done, and then close the Query Editor.
  6. Open your workbook.
  7. In Data tab, click New Query, then click From Other Sources and click Blank Query.
  8. In the Query Editor, click Advanced Editor.
  9. Paste the code from step 5, and click Done.
  10. In the Query Editor, Query Settings pane, rename the query to FnUnpivotNestedTable.
  11. In Home tab, click Close & Load.

To use the function query on your nested table:

  1. Select your nested table.
  2. In Data tab, click From Table.
  3. In the Create Table dialog, uncheck the box My table has headers and click OK.
  4. In the Query Editor, remove subtotal and total rows and columns. We didn't discuss this step in this tutorial, but it is a basic one. Use the filtering functionality and the Remove Top/Bottom Rows commands to remove unnecessary rows, and delete columns that contain subtotals or totals.
  5. Click Advanced Editor and add the FnUnpivotNestedTable formula. For example:
    Result = FnUnpivotNestedTable(#"Previous step", {"Continent", "Country", "City"},{"School Type", "Gender", "Class"})

Here is a screenshot of the Advanced Editor with the example from our workbook. The second command is the call to the function FnUnpivotNestedTable. It requires 3 parameters:

  1. The table to unpivot
  2. The row headers as a list - On our example: {"Continent", "Country", "City"}
  3. The column headers as a list - On our example: {"School Type", "Gender", "Class"}

Note: You can apply any number of nested row headers and columns. If you have a single row header and multiple nested columns (or vice versa), it will work as well.

Next - How we created FnUnpivotNestedTable


Follow me on Twitter to get more cool stuff.

 

FnUnpivotNestedTable


On the previous page of this tutorial, we used a function query, FnUnpivotNestedTable, to unpivot ANY nested table. In this section we will drill down into the function itself, and show you how we created it.

Note: This section is intended for advanced users of Power Query, and requires moderate knowledge of the Power Query formula language (M). If you feel this section is not relevant for you, you can still use the tutorial on the previous page.

We will start with the auto-generated M code that was created earlier in this tutorial, when we transformed the table in this workbook through the Query Editor.

Let's highlight all the sections in the code that were specific to the table we used, and transform the expression into a generic one.


let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    #"Filled Down" = Table.FillDown(Source,{"Column1", "Column2"}),
    #"Transposed Table" = Table.Transpose(#"Filled Down"),
    #"Filled Down1" = Table.FillDown(#"Transposed Table",{"Column1", "Column2"}),
    #"Merged Columns" = Table.CombineColumns(#"Filled Down1",{"Column1", "Column2", "Column3"},Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"Merged"),
    #"Transposed Table1" = Table.Transpose(#"Merged Columns"),
    #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table1"),
    #"Renamed Columns" = Table.RenameColumns(#"Promoted Headers",{{";;", "Continent"}, {";;_1", "Country"}, {";;_2", "City"}}),
    #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(#"Renamed Columns", {"Continent", "Country", "City"}, "Attribute", "Value"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Unpivoted Other Columns","Attribute",Splitter.SplitTextByDelimiter(";"),{"Attribute.1", "Attribute.2", "Attribute.3"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Split Column by Delimiter",{{"Attribute.1", "School Type"}, {"Attribute.2", "Gender"}, {"Attribute.3", "Class"}})
in
    #"Renamed Columns1"


To start our function, let's add the function definition. The function will accept a table and lists of row and column header names (e.g. {"Continent", "Country", "City"} and {"School Type", "Gender", "Class"} ).


Before:

let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],

     ...

in
    #"Renamed Columns1"


After:

(Source as table, RowHeaders, ColumnHeaders) =>
let

  ...

in
    #"Renamed Columns1"


We will now handle the Fill Down transformation by creating a dynamic list that contains the first N-1 column names.


Before:

#"Filled Down" = Table.FillDown(Source,{"Column1", "Column2"}),


After:

//Get a list of the first n column names of a table
GetFirstHeaders = (src as table, n as number) as list => List.Range(Table.ColumnNames(src), 0, n),

RowHeaderCount = List.Count(RowHeaders),
ColumnHeaderCount = List.Count(ColumnHeaders),
    
//The row headers we should apply fill down
FillDownRowHeaders = GetFirstHeaders(Source, RowHeaderCount - 1),

//Apply Fill Down on row headers
FilledDownRowFields = Table.FillDown(Source,FillDownRowHeaders),


Now let's handle the transpose and second fill down transformation.

The challenge here is to create a dynamic list of "Column1", "Column2", ..., "ColumnN" based on the number of items in the ColumnHeaders parameter list (e.g. {"School Type", "Gender", "Class"} ). Note that we create a list of "Column1", "Column2", etc. here, and only use the input list to determine how many of the first columns we need to fill down.


Before:

#"Transposed Table" = Table.Transpose(#"Filled Down"),
#"Filled Down1" = Table.FillDown(#"Transposed Table",{"Column1", "Column2"}),


After:

//Transpose table
TransposedTable = Table.Transpose(FilledDownRowFields),

//This line will be called later for a merge operation. For now, all we need to know is that it contains a list of "Column1", "Column2", ..., "ColumnN".
ColumnHeadersToCombine = GetFirstHeaders(TransposedTable, ColumnHeaderCount), 
   

//The column headers we should fill down
FilledDownColumnHeaders = List.Range(ColumnHeadersToCombine, 0, ColumnHeaderCount - 1),

//Apply Fill Down on column headers
FilledDownColumnFields = Table.FillDown(TransposedTable, FilledDownColumnHeaders),


The next three steps are relatively easy. In the previous step we built a list of "Column1", ..., "ColumnN"; we will use it in the CombineColumns step.


Before:

#"Merged Columns" = Table.CombineColumns(#"Filled Down1",{"Column1", "Column2", "Column3"},Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"Merged"),
#"Transposed Table1" = Table.Transpose(#"Merged Columns"),
#"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table1"),


After:

//Merge columns with a semicolon delimiter
MergedColumns = Table.CombineColumns(FilledDownColumnFields, ColumnHeadersToCombine, Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"Merged"),
    

//Transpose the table back
TransposedBackTable = Table.Transpose(MergedColumns),
    

//Promote first row as headers
PromotedHeaders = Table.PromoteHeaders(TransposedBackTable),
   


Now comes the most complex code: building the second parameter for the RenameColumns step. We start by using Text.PadStart to create a string of semicolons whose length is N-1, where N is the size of the ColumnHeaders input parameter. Then we create the steps that build the strings ";;", ";;_1" and ";;_2" in our example, and finally, we combine these strings with the items in RowHeaders.
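The same helper-list construction can be sketched in Python for the 3x3 case of our example (this only mirrors the M logic; the real work is done by Text.PadStart, List.Transform and Table.FromColumns):

```python
row_headers = ["Continent", "Country", "City"]
column_header_count = 3  # {"School Type", "Gender", "Class"}

# Text.PadStart("", ColumnHeaderCount - 1, ";") → ";;"
semicolons = ";" * (column_header_count - 1)

# HelperListPhase1 → [';;', ';;_1', ';;_2']
phase1 = [semicolons if i == 0 else f"{semicolons}_{i}"
          for i in range(len(row_headers))]

# HelperListForRowHeaders → the old-name/new-name pairs for RenameColumns
rename_pairs = [list(p) for p in zip(phase1, row_headers)]
print(rename_pairs)
# → [[';;', 'Continent'], [';;_1', 'Country'], [';;_2', 'City']]
```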


Before:

#"Renamed Columns" = Table.RenameColumns(#"Promoted Headers",{{";;", "Continent"}, {";;_1", "Country"}, {";;_2", "City"}}),


After:

//In this section we build the necessary text and lists that will allow us to rename the columns to the values in RowHeaders
//Here was the original step that we will dynamically build:
//RenamedRowHeaders = Table.RenameColumns(TransposedBackTable,{{";;", "Continent"}, {";;_1", "Country"}, {";;_2", "City"}}),
    
SemicolonsText = Text.PadStart("", ColumnHeaderCount - 1, ";"),
HelperListPhase1 = List.Transform(List.Numbers(0, RowHeaderCount), each if (_ = 0) then SemicolonsText else SemicolonsText & "_" & Text.From(_)),
HelperListForRowHeaders = Table.ToRows(Table.FromColumns({HelperListPhase1, RowHeaders})),
    
//Here we rename the columns that contains the input Row Headers
RenamedRowHeaders = Table.RenameColumns(PromotedHeaders, HelperListForRowHeaders), 


In the last two steps, we apply UnpivotOtherColumns on the first columns, which now contain the original Row Headers (e.g. Continent, Country & City). We can use the input RowHeaders list to replace the static column names. Then we can perform the SplitColumn operation and use the original ColumnHeaders list instead of {"Attribute.1", "Attribute.2", "Attribute.3"}. By doing so, we no longer need the Rename Columns step.


 

Before:

    #"Unpivoted Other Columns" = Table.UnpivotOtherColumns(#"Renamed Columns", {"Continent", "Country", "City"}, "Attribute", "Value"),
    #"Split Column by Delimiter" = Table.SplitColumn(#"Unpivoted Other Columns","Attribute",Splitter.SplitTextByDelimiter(";"),{"Attribute.1", "Attribute.2", "Attribute.3"}),
    #"Renamed Columns1" = Table.RenameColumns(#"Split Column by Delimiter",{{"Attribute.1", "School Type"}, {"Attribute.2", "Gender"}, {"Attribute.3", "Class"}})
in
    #"Renamed Columns1"


 

After:

    //Here we perform the unpivot step
    UnpivotedOtherColumns = Table.UnpivotOtherColumns(RenamedRowHeaders, RowHeaders, "Attribute", "Value"),

    //Here we split back the Column Headers
    SplitColumnByDelimiter = Table.SplitColumn(UnpivotedOtherColumns,"Attribute",Splitter.SplitTextByDelimiter(";"), ColumnHeaders)

in

    SplitColumnByDelimiter


That's it, we modified the original code into a function that can accept any kind of nested table and perform the unpivot operation on it.

Here is the new function:

(Source as table, RowHeaders, ColumnHeaders) =>
let
 
    // Get a list of the first n column names of a table
    GetFirstHeaders = (src as table, n as number) as list =>
  List.Range(Table.ColumnNames(src), 0, n),

    RowHeaderCount = List.Count(RowHeaders),
    ColumnHeaderCount = List.Count(ColumnHeaders),
   
    //The row headers we should apply fill down
    FillDownRowHeaders = GetFirstHeaders(Source, RowHeaderCount - 1),

    //Apply Fill Down on row headers
    FilledDownRowFields = Table.FillDown(Source,FillDownRowHeaders),

    //Transpose table
    TransposedTable = Table.Transpose(FilledDownRowFields),

    //The columns that we will merge together
    ColumnHeadersToCombine = GetFirstHeaders(TransposedTable, ColumnHeaderCount), 
   
    //The column headers we should fill down
    FilledDownColumnHeaders = List.Range(ColumnHeadersToCombine, 0, ColumnHeaderCount - 1),

    //Apply Fill Down on column headers
    FilledDownColumnFields = Table.FillDown(TransposedTable,FilledDownColumnHeaders),

    //Merge columns with a semicolon delimiter
    MergedColumns = Table.CombineColumns(FilledDownColumnFields, ColumnHeadersToCombine, Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"Merged"),
   
    //Transpose the table back
    TransposedBackTable = Table.Transpose(MergedColumns),
   
    //Promote first row as headers
    PromotedHeaders = Table.PromoteHeaders(TransposedBackTable),
   
    //In this section we build the necessary text and lists that will allow us to rename the columns to the values in RowHeaders
    //Here was the original step that we will dynamically build:
    //RenamedRowHeaders = Table.RenameColumns(TransposedBackTable,{{";;", "Continent"}, {";;_1", "Country"}, {";;_2", "City"}}),
   
    SemicolonsText = Text.PadStart("", ColumnHeaderCount - 1, ";"),
    HelperListPhase1 = List.Transform(List.Numbers(0, RowHeaderCount), each if (_ = 0) then SemicolonsText else SemicolonsText & "_" & Text.From(_)),
    HelperListForRowHeaders = Table.ToRows(Table.FromColumns({HelperListPhase1, RowHeaders})),
   
    //Here we rename the columns that contains the input Row Headers
    RenamedRowHeaders = Table.RenameColumns(PromotedHeaders, HelperListForRowHeaders),

    //Here we perform the unpivot step
    UnpivotedOtherColumns = Table.UnpivotOtherColumns(RenamedRowHeaders, RowHeaders, "Attribute", "Value"),

    //Here we split back the Column Headers
    SplitColumnByDelimiter = Table.SplitColumn(UnpivotedOtherColumns,"Attribute",Splitter.SplitTextByDelimiter(";"), ColumnHeaders),
 
    //Change the type of Value columns to number
    ChangeToDecimal = Table.TransformColumnTypes(SplitColumnByDelimiter ,{{"Value", type number}})
in
    ChangeToDecimal


Hope you enjoyed this blog post.

Follow me on Twitter to get more cool stuff.

Send SCOM Custom Performance Counters to OMS


Since the release of the Operations Management Suite, a lot of solutions and intelligence packs have been added. One of the latest is a pack that uploads Performance Counters to OMS, with the counters defined in the OMS console.

This performance collection is set up by OMS intelligence packs that are distributed to the SCOM Management Server and Agent.

Intelligence Pack ID                    | Intelligence Pack Display Name                                 | Management Pack Bundle
----------------------------------------|----------------------------------------------------------------|---------------------------------------------
Microsoft.IntelligencePacks.Performance | Microsoft System Center Advisor Performance collection library | Microsoft.IntelligencePacks.Performance.mpb
Microsoft.IntelligencePacks.Types       | Microsoft System Center Advisor Types Library                  | Microsoft.IntelligencePacks.Types.mpb

 

This has brought up the question of whether it is also possible to upload custom performance counters from SCOM into OMS. My colleague Wei Hao Lim has shown that it is indeed possible by using two Write Actions from these Intelligence Packs. For details, look at http://blogs.msdn.com/b/wei_out_there_with_system_center/archive/2015/09/29/oms-collecting-nrt-performance-data-from-an-opsmgr-powershell-script-collection-rule-created-from-a-wizard.aspx.

For the upload to work, the Performance Object Name has to be built like this: \\FQDN\ObjectName. This means that if you just add the new write actions to a standard rule, you end up with Performance Objects in SCOM that have a different ObjectName for each agent.

While this might just be a cosmetic problem, I don't really like it, having fought many battles with the SQL Performance Counters that in many cases include the SQL engine name. So my goal was to create write actions into OMS based on the ones that Wei used, but to send a normal-looking ObjectName into the SCOM database and DWH, and an OMS-compliant ObjectName into OMS. To achieve this, I created two new WriteActionTypes like this:

 

 

Basically, what's happening here is that I create the Performance Counters with a "normal" SCOM ObjectName, and when it is time to send them to OMS, I run them through a second DataGenericMapper to change the ObjectName into the OMS-compliant format of \\FQDN\ObjectName. After that, they are fed into the same write actions that Wei described in his blog post.

As you can see, the first DataGenericMapper gets the PropertyBag output of your custom Performance Counter script in the well-known $Data/Property[@Name='<PropertyName>']$ format. The second one gets the same data, which is then already on the data bus and can therefore be referred to by $Data/<PropertyName>$. If you target a class hosted by Microsoft.Windows.Computer, the FQDN translates to

\\$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$

making the whole term look like

\\$Target/Host/Property[Type="Windows!Microsoft.Windows.Computer"]/NetworkName$\$Data/ObjectName$.
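In other words, the OMS-compliant name is simply the FQDN and the SCOM ObjectName concatenated with backslashes. A tiny sketch of the resulting format (the server and counter names below are made up):

```python
def oms_object_name(fqdn, object_name):
    """Build the OMS-compliant performance object name: \\FQDN\ObjectName."""
    return "\\\\" + fqdn + "\\" + object_name

print(oms_object_name("server01.contoso.com", "MyCustomCounters"))
# → \\server01.contoso.com\MyCustomCounters
```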

These two new WriteActionTypes can then be used in your Performance Collection Rule like this:

By doing this, you get a "SCOM-like" Performance ObjectName in SCOM while still complying with the OMS ObjectName format, and you can upload your custom performance data into OMS.

Here is the counter in SCOM:

And here the same counter in OMS:

 

So if you are not shy about editing Management Packs in XML, import the Management Pack and start using the new Write Actions.

The reference should look similar to this:

Now add the new Write Actions to your Rule. In the XML it should then look like this:

Now be patient, it might take a few hours for the counters to show up in OMS, but eventually they will.

Hope you are having fun with this and start using OMS more often…

You can download the Management Pack from https://gallery.technet.microsoft.com/Sample-MP-WriteActions-to-b720052e

 

Some general Information about Near Real Time (NRT) Performance Data in OMS:
http://blogs.technet.com/b/momteam/archive/2015/09/01/near-real-time-performance-data-collection-in-oms.aspx

 

 

Disclaimer:
All information on this blog is provided on an as-is basis with no warranties and for informational purposes only. Use at your own risk. The opinions and views expressed in this blog are those of the author and do not necessarily state or reflect those of my employer.

 


Unpivot ANY nested table and load to Power BI Desktop


Earlier today I shared with you a function query here that unpivots ANY nested table in Excel and transforms summarized tables into a PivotTable. This post will show you how to import the function query and use it in Power BI Desktop on any table you load from Excel.

It is a common scenario to have nested/summarized tables in Excel that you need to unpivot before you build a dashboard with Power BI Desktop. With the function query described here, you can load ANY nested table into Power BI Desktop and transform it to a format you can start working with.

Open Power BI Desktop, expand the Get Data menu, and click Blank Query.

Click Advanced Editor, paste the M expression below, and then click Done:


(Source as table, RowHeaders, ColumnHeaders) =>
let
 
    // Get a list of the first n column names of a table
    GetFirstHeaders = (src as table, n as number) as list =>
  List.Range(Table.ColumnNames(src), 0, n),

    RowHeaderCount = List.Count(RowHeaders),
    ColumnHeaderCount = List.Count(ColumnHeaders),
   
    //The row headers we should apply fill down
    FillDownRowHeaders = GetFirstHeaders(Source, RowHeaderCount - 1),

    //Apply Fill Down on row headers
    FilledDownRowFields = Table.FillDown(Source,FillDownRowHeaders),

    //Transpose table
    TransposedTable = Table.Transpose(FilledDownRowFields),

    //The columns that we will merge together
    ColumnHeadersToCombine = GetFirstHeaders(TransposedTable, ColumnHeaderCount), 
   
    //The column headers we should fill down
    FilledDownColumnHeaders = List.Range(ColumnHeadersToCombine, 0, ColumnHeaderCount - 1),

    //Apply Fill Down on column headers
    FilledDownColumnFields = Table.FillDown(TransposedTable,FilledDownColumnHeaders),

    //Merge columns with a semicolon delimiter
    MergedColumns = Table.CombineColumns(FilledDownColumnFields, ColumnHeadersToCombine, Combiner.CombineTextByDelimiter(";", QuoteStyle.None),"Merged"),
   
    //Transpose the table back
    TransposedBackTable = Table.Transpose(MergedColumns),
   
    //Promote first row as headers
    PromotedHeaders = Table.PromoteHeaders(TransposedBackTable),
   
    //In this section we build the necessary text and lists that will allow us to rename the columns to the values in RowHeaders
    //Here was the original step that we will dynamically build:
    //RenamedRowHeaders = Table.RenameColumns(TransposedBackTable,{{";;", "Continent"}, {";;_1", "Country"}, {";;_2", "City"}}),
   
    SemicolonsText = Text.PadStart("", ColumnHeaderCount - 1, ";"),
    HelperListPhase1 = List.Transform(List.Numbers(0, RowHeaderCount), each if (_ = 0) then SemicolonsText else SemicolonsText & "_" & Text.From(_)),
    HelperListForRowHeaders = Table.ToRows(Table.FromColumns({HelperListPhase1, RowHeaders})),
   
    //Here we rename the columns that contains the input Row Headers
    RenamedRowHeaders = Table.RenameColumns(PromotedHeaders, HelperListForRowHeaders),

    //Here we perform the unpivot step
    UnpivotedOtherColumns = Table.UnpivotOtherColumns(RenamedRowHeaders, RowHeaders, "Attribute", "Value"),

    //Here we split back the Column Headers
    SplitColumnByDelimiter = Table.SplitColumn(UnpivotedOtherColumns,"Attribute",Splitter.SplitTextByDelimiter(";"), ColumnHeaders)
 
in
    SplitColumnByDelimiter


Now import this workbook to your Power BI Desktop.

In the Navigator dialog, select Sheet1 and click OK.

The following table will be shown. You must agree that there is nothing you can do with this type of table structure :)

Rename the function query Query1 to FnUnpivotNestedTable, and click the fx button in the formula bar.

Paste the formula below and press Enter:

= FnUnpivotNestedTable(#"Changed Type", {"Continent", "Country", "City"}, {"School Type", "Gender", "Class"})

Change the Value column to Whole Number, and click Close & Apply.

That is it. You are now ready to build amazing reports with the data. You can see all the relevant fields in the Fields Well of the Reports view, and select your favorite visual.

Enjoy :)

How to install the operating system based upon the size of the disks using MDT.


In this blog, I will illustrate a cheeky little improvisation in MDT that will enable you to install the operating system based upon the disk size of the client machine. Here is an interesting issue that I recently came across.

The user had two disks in their servers: a smaller SSD, and a larger disk meant to store the data and the operating system. Interestingly, there was no specific order in which the disks were enumerated when booted into WinPE: the SSD would sometimes be disk 0 and sometimes disk 1 on two different servers, even though make and model were identical. So we could not simply hard-code the disk number in the "Format and Partition" step to prevent the OS from being installed on the SSD. Of course, a custom diskpart script was also out of the question, because if a Format and Partition step is not called, the deployment fails.

So here is what we did to eventually get it working:

1. Create a variable in the task sequence and set its value to 'undefined'. In my lab I used my name, Mayank, as the variable name. (This step should come just before the Format and Partition step in your TS.)

2. Create a sample script: copy the code below into Notepad and save it as "decisionscript.ps1". This script identifies which disk is the largest; the resulting value will be 0, 1, 2, etc. Copy this script into the scriptroot folder and call it just after declaring the variable and before the Format and Partition step.

# Get the sizes of all physical disks, in enumeration order
$array = (gwmi win32_DiskDrive).size

# $m tracks the index of the largest disk found so far
$m = 0
$j = 0
foreach ($i in $array)
{
    if (($j + 1) -lt $array.Count -and $array[$m] -le $array[$j + 1])
    {
        $m = $j + 1
    }
    $j = $j + 1
}

# Store the index of the largest disk in the task sequence variable
$tsenv:Mayank = $m
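The loop above simply computes the index of the largest disk. The same logic expressed in Python terms (the disk sizes below are made-up examples):

```python
def largest_disk_index(sizes):
    """Return the index of the largest value, i.e. the largest disk."""
    return max(range(len(sizes)), key=lambda i: sizes[i])

# e.g. a 256 GB SSD enumerated as disk 0 and a 2 TB data disk as disk 1:
print(largest_disk_index([256_000_000_000, 2_000_000_000_000]))  # → 1
```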

3. Create as many "Format and Partition" steps in MDT as there are disks (in our case, two) and set an 'if' condition on each, so that if the value of the variable Mayank is 0, the "Format and Partition" step that formats disk 0 is called, and so on. At any point in time, only one step will be executed.

4. Now we need to add support for PowerShell in MDT and also ensure that $tsenv is available for us. For this, mount the LiteTouchPE boot image.


5. Copy the Microsoft.BDD.TaskSequenceModule and ZTIUtility modules from the Tools\Modules folder inside the deployment share into the C:\Windows\System32\WindowsPowerShell\v1.0\Modules folder of the mounted image.

6. Commit the changes, unmount the WIM, and use it to start the deployment.

Thanks for reading!

Discussing What's New in Exchange Server 2016 at the Unified Communication User Community Meeting


On October 27, 2015, from 18:00 to 21:00 (MSK), the seventh meeting of the Unified Communication User Community (UC2) will take place, on the topic: What's New in Exchange Server 2016.

The meeting will be held in two formats:

Registration is required for all forms of participation.

Venue: Microsoft Technology Center (Moscow, Lesnaya St. 9, Belorusskaya metro station)

 

Agenda:

17:45 – 18:00 Welcome coffee

18:00 – 20:00 What's New in Exchange Server 2016. Dmitry Khrebin, Microsoft.

During the talk, Dmitry will cover the following topics:

  1. Changes in the Information Store architecture, including DB Divergence Detection.

  2. Office Online Server: what it is and why you need it.

  3. Changes in the Exchange ActiveSync v.16 protocol.

20:00 – 20:45 Q&A session.

 

About the speaker:

Dmitry Khrebin is a member of the Microsoft Russia support team, specializing in Exchange Server.

What's New in Project 2016


(This article is a translation of What's new in Project 2016, published on Office Blogs on September 30, 2015. For the latest information, please refer to the original article.)

 

Microsoft has announced the release of Project 2016. This version brings major updates to Project Professional, Project Pro for Office 365, and Project Online. Project 2016 implements capabilities that customers have frequently requested around resource management and resource capacity planning. These key capabilities are delivered as the new resource engagements, a new experience for resource managers, capacity heat maps, and more. Project 2016 provides the end-to-end experience needed to manage and optimize resource utilization.

Beyond resource management, many enhancements and new features help you be more productive in Project, including improved timelines, integrated Tell Me assistance, and full support for Office add-ins.

Resource engagements

Project managers constantly face the challenge of securing resources and properly staffing their projects. With Project 2016, you can systematically request resources, and once the request is approved, the resources are locked in for you.

When you create an engagement and submit it for approval, a simple workflow starts in which the resource manager can approve or reject the request.

Project managers can always see the latest status of their resource requests in Project Professional 2016 or Project Pro for Office 365. If a resource has been committed to one project for a given period and another project's manager tries to double-book that resource, the manager is notified.

The resource manager experience

Resource managers (or line managers) are in charge of people, so they do not necessarily need the full Project client. To let resource managers collaborate with project managers, Project Online now includes the ability to view, approve, and reject all submitted resource engagements. In addition, managers can now instantly grasp resource utilization using the new resource views.

Resource capacity heat maps

Using resources effectively and keeping them productive is a strategic priority for every organization. In Project 2016, capacity heat maps and new, intuitive reports let you see resource utilization at a glance. Both under-utilization and over-utilization are problems, and the new heat maps let you quickly spot either state.

Timelines

Communicating the project schedule is an important part of every project manager's job. For many audiences, however, a Gantt chart is not the best way to visualize a schedule; a timeline is the best way to depict a project's lifecycle. With this in mind, Microsoft completely overhauled the Timeline feature so that you can use multiple timelines, each with its own start and end date and a user-defined set of tasks and milestones.

The new timeline visualization provides the following great capabilities:

  • Set start and end dates per timeline
  • Drag and drop between timelines
  • Save to PowerPoint as editable objects

Tell Me

Wouldn't it be great if, instead of hunting through Project 2016's many features, you could simply ask where a feature is? The Tell Me feature makes that possible. Even better, Tell Me's answers are buttons, so you can perform the action you need just by clicking the answer, saving a lot of time.

Read/write Office add-ins

Office add-ins are extensions that add functionality to Project and can be downloaded from the Office Store. This means that Microsoft and its partners can continuously and easily deliver new features, and users can customize Project to their liking. In Project 2016, Office add-ins get full read/write access to the active project, enabling richer extensions than ever before.

How to get it

Project 2016 is available online now as part of Office 365. The new service-side features such as capacity heat maps are currently available in Project Pro for Office 365 and Project Online. When customers enable these features, all existing resource plans are automatically upgraded to resource engagements. The new resource management features for on-premises customers will arrive in spring 2016, when Project Server 2016 is released.

We hope you enjoy the new Project.

—Howard Crow (Principal Group Program Manager, Project engineering team)

 

What's new in Project 2016


(This article is a translation of What's new in Project 2016, posted to Office Blogs on September 30, 2015. For the latest information, please refer to the original article.)

 

Today, Microsoft announced the release of Project 2016. This release delivers significant updates across Project Professional, Project Pro for Office 365, and Project Online. Project 2016 implements the capabilities customers have requested most around resource management and resource capacity planning. These major features are delivered as the new Resource Engagements, a new experience for resource managers, capacity heat maps, and more. Project 2016 provides the end-to-end experience needed to manage and optimize resource utilization.

Beyond resource management, this release includes many improvements and new features that help you be more productive in Project, including an improved Timeline, integrated Tell Me assistance, and full support for Office add-ins.

Resource Engagements

Project managers face the perennial challenge of securing resources and properly staffing their projects. With Project 2016, you can systematically request a resource and, once the request is approved, lock in that resource for your project.

When you create an engagement and submit it for approval, a simple workflow begins in which the resource manager can approve or reject the request.

Project managers can always see the latest status of their resource requests in Project Professional 2016 or Project Pro for Office 365. If a resource has been committed to one project for a given period and the manager of another project tries to book that same resource over the same time, they are notified of the conflict.

Resource manager experience

Resource managers (or line managers) are responsible for people and likely don't need the full Project client. To help resource managers collaborate with project managers, Project Online now lets them view, accept, and reject all incoming resource engagement requests. In addition, managers can use the new resource views to see resource utilization at a glance.

Capacity heat maps

Making effective use of resources and keeping them productive is a strategic priority for every business. In Project 2016, capacity heat maps and new, intuitive reports let you see resource utilization at a glance. Underutilization and overutilization are both problems, and the new heat maps make it quick to spot either condition.

Timeline

Communicating the project schedule is an important part of any project manager's job, but for many audiences a Gantt chart is not the best way to visualize it. A timeline is often the best way to tell the story of a project's lifecycle. With that in mind, we completely reworked the Timeline feature: you can now have multiple timelines, each with its own start and end dates and its own user-defined set of tasks and milestones.

The new timeline visualization lets you:

  • Set start and end dates per timeline
  • Drag and drop between timelines
  • Save to PowerPoint as editable objects

Tell Me

Wouldn't it be great if, instead of hunting through Project 2016's many features, you could simply ask where the one you need is? With Tell Me, you can. Even better, Tell Me's answers are buttons, so you can click an answer to perform the action right away, saving you significant time.

Read/write Office add-ins

Office add-ins are extensions that add functionality to Project and can be downloaded from the Office Store. They let Microsoft and our partners continuously and easily deliver new capabilities, and let you customize Project to your liking. In Project 2016, Office add-ins get full read/write access to the active project, enabling richer extensions than ever before.

Availability

Project 2016 is available online today as part of Office 365. The new service-side features, such as capacity heat maps, are available now in Project Pro for Office 365 and Project Online. When you activate these features, all of your existing resource plans are automatically upgraded to resource engagements. For on-premises customers, the new resource management features will arrive with Project Server 2016, which is scheduled for release in spring 2016.

We hope you enjoy the new Project.

—Howard Crow, Principal Group Program Manager, Project engineering team
