
Virtual Entities


Interact with data from external systems using the new virtual entities

Starting with the July 2017 Update for Dynamics 365 (online), virtual entities enable the integration of data residing in external systems by seamlessly representing that data as entities in Dynamics 365, without replication of data and often without custom coding.

Virtual entities replace previous client-side and server-side approaches to integrating external data, which required customized code and suffered from numerous limitations, including imperfect integration, data duplication, or extensive commitment of development resources. In addition, for administrators and system customizers, the use of virtual entities greatly simplifies administration and configuration.

A virtual entity is a definition of an entity in the Dynamics 365 platform metadata without the associated physical tables for entity instances created in the Dynamics 365 database. Instead, at runtime, when an entity instance is required, its state is dynamically retrieved from the associated external system. Each virtual entity type is associated with a virtual entity data provider and (optionally) some configuration information from an associated virtual entity data source.

Records based on virtual entities are available from all Dynamics 365 (online) clients, including custom applications developed using the Dynamics 365 SDK.

Virtual entities provide these benefits:

  • End users work with the records created by the virtual entity to view the data in fields, grids, search results, and FetchXML-based reports and dashboards.
  • System customizers can configure the data source record and create virtual entities without writing any code.
  • Developers can implement plugins to read external data using the Dynamics 365 SDK and Dynamics 365 (online) Plug-in Registration tool.

Considerations

In this release, virtual entities have some restrictions:

  • Data is read-only
  • Only organization-owned entities are supported
  • Field-level security is not supported
  • It must be possible to model the external data as a Dynamics 365 entity. This means:
    • All entities in the external data source must have an associated GUID primary key
    • All entity properties must be represented as Dynamics 365 attributes - you can use simple types representing text, numbers, optionsets, dates, images, and lookups
    • You must be able to model any entity relationships in Dynamics 365

Example

In this blog post I'll walk you through a simple way of creating a virtual entity.

What you need to test this out:

  1. Access to a service exposing data in OData v4
  2. An entity in this external data source with an associated GUID primary key
  3. Access to the latest preview of Dynamics 365

On the odata.org site you will find a service you can use for testing. Just type services.odata.org/V4/OData/OData.svc/$metadata in your browser to see a list of entities.


Fig. 1

If you expand a given entity branch, e.g. Product, check whether the entity has an associated key of type GUID (requirement #2 above). Note that this is NOT the case for the Product entity (see the picture below), so this entity is not one we can bring in.

Fig. 2

However, expanding the Advertisement entity branch, we see that this entity DOES have an associated key of type GUID, so we will use that. Note that the ID is spelled "ID" (all capital letters), and also note the Name and AirDate properties - we will get back to those.


Fig. 3

To see which records this data set returns, we need the collection (i.e. plural) name of the entity. For that, type services.odata.org/V4/OData/OData.svc in the browser and see that the collection name is Advertisements.

Fig. 4

Using this information we can now type services.odata.org/V4/OData/OData.svc/Advertisements in the browser to see that the data set holds two records (these are the two external records we will surface in Dynamics 365 using the new Virtual Entity capability).

Fig. 5
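As a side note (not part of the original walkthrough), the same checks can be scripted. Here is a minimal PowerShell sketch, assuming only that the public sample service is reachable; it lists the entity sets and then the two Advertisement records with the ID, Name and AirDate properties used below:

# Minimal sketch: inspect the public OData v4 test feed from PowerShell instead of the browser
$base = "http://services.odata.org/V4/OData/OData.svc"
$json = @{ Accept = "application/json" }

# Entity sets (collection names) exposed by the service - "Advertisements" should be listed
(Invoke-RestMethod "$base/" -Headers $json).value | Select-Object name, url

# The two Advertisement records, showing the ID, Name and AirDate properties we map later
(Invoke-RestMethod "$base/Advertisements" -Headers $json).value | Select-Object ID, Name, AirDate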

Create a Virtual Entity Data Source

With the data source identified we can proceed to the next step - creating a Virtual Entity Data Source.

In Dynamics 365 click Settings -> Administration (1) and then Virtual Entity Data Sources (2) to open the Data Sources grid.


Fig. 6

In the Data Sources grid click NEW (1) to open the Select Data Provider dialog

Fig. 7

In the Select Data Provider dialog select OData v4 Data Provider (the only option) and then click OK to open the New OData v4 Data Source dialog

Fig. 8


Fig. 9

In the New OData v4 Data Source dialog, fill out the three fields in the General section:

  1. In the Name text box type a name of your choice for the data source (I'll use "Public Service")
  2. In the URL text box type or paste the URL from above (fig. 4): services.odata.org/V4/OData/OData.svc
  3. In the Timeout text box (optional) type the number of seconds to wait for a response from the web service before quitting a data operation

Then click OK (4) to return to the Data Sources grid.

Fig. 10


Fig. 11

Create a Virtual Entity

The last step is to create a Virtual Entity to bring in data from the OData source.

Click Settings -> Administration -> Customize the System 

Fig. 12

Create a virtual entity like any custom entity, and then select the Virtual Entity check box (1) - see below

Selecting the check box displays additional information requirements for the data source (2), as well as the External Name and External Collection Name values (3) for the entity definition.

So in this example:

  1. The new Virtual Entity is named Advertisement (a name I typed)
  2. The Data Source is Public Service (as per fig. 10 above)
  3. The External Name = Advertisement and External Collection Name = Advertisements (as per fig. 3 and 4 above)


Fig. 13

Once the new Virtual Entity is created, there are a couple of important things to verify and edit.

The system automatically creates two fields, one for the ID (1) and one for the name (2). You will need to map those to the external data source names.

Fig. 14

So for the ID field (the primary key) ensure that the name in the External Name text box (1) matches the property name in the data source (2) as per figure 3 above. Observe case sensitivity.

Fig. 15

For the Name field ensure that the name in the External Name text box (1) and the Data Type (2) match the property name and type in the data source (3), as per figure 3 above. Observe case sensitivity.

Fig. 16

Optionally you can create a third field to bring in the AirDate field from the data source (Date type)  as per figure 3 above. Observe case sensitivity.

Fig. 17

Create a form with the desired fields

Fig. 18

Create a view with the desired columns

Fig. 19

Publish customizations

Fig. 20

You now have a list with the two records from fig 5 above

Fig. 21

Opening a record will display the form you defined

Fig. 22

We now have a new elegant way of surfacing data from external data sources in Dynamics 365 without the data residing in Dynamics 365.

Enjoy.

 

 


Gamescom & IFA: My Hardware Trends for the Holiday Season


Eventful weeks lie behind Thomas Kowollik. As head of the consumer and hardware business, he was out and about at gamescom and IFA. In this post he reports on his very personal highlights of the past trade fair days.

A global beat for the tech world

Every year, the heart of the tech world beats in Germany for a short time: visitors from all parts of our planet meet at the gamescom and IFA trade fairs to experience the latest gaming and tech trends live. Together with its partners, Microsoft uses this stage to present its latest products and to fire the starting gun for the holiday season. The trade fairs thus offer the perfect setting to give new impetus to the gaming and IT markets while gathering valuable customer feedback.

Gaming as a factor for business and politics

Even before the fair it was clear that the Xbox One X would attract gamers in droves: thousands of fans came to our booth to take a first look at the most powerful console in the world and to experience native 4K gaming for themselves for the first time. A world premiere, because at gamescom gamers were able to play on the Xbox One X for the very first time. And truly: 60 frames per second at true 4K resolution on a console is an impressive experience for me as well!

Part of the fascination of gamescom is getting feedback first hand. On site we received awards in both software and hardware: the Xbox One X won the gamescom Award in the prestigious "Best Hardware" category, while Forza Motorsport 7 was awarded the prize for "Best Racing Game". The palpable euphoria at the fair was confirmed by the pre-sale of the Xbox One X Scorpio Edition: after less than a day it was completely sold out.

Even more impressive was seeing that gaming has not only arrived in the mainstream of society but is firmly anchored there. If the Federal Chancellor is to be believed, we are all gamers! In fact, more than 34 million people in Germany already play computer or video games, and the trend is rising. In Cologne, Angela Merkel held out the prospect of stronger government support for players, developers, and manufacturers: "Computer and video games are of the greatest importance as a cultural asset, an engine of innovation, and an economic factor." I particularly like to remember the iconic picture of the Chancellor next to her Minecraft avatar, which went around the world.

IFA 2017 shows innovative devices with Windows 10

In Microsoft's trade fair portfolio, IFA, the world's leading trade fair for consumer electronics and home appliances, has become an important international milestone. There we present our product highlights for the upcoming holiday season and receive direct feedback from our customers, which is indispensable for the development of future products. The fair also offers the opportunity to exchange ideas with a wide variety of partners in a very short time, from hardware partners to retail partners. I was very impressed by the broad device portfolio of our OEM partners Acer, Dell, Lenovo, HP, and ASUS. But our local device partners, such as Trekstor, also presented a fantastic line-up for a successful holiday season.

Alongside the exchange with customers and partners, the current tech trends at IFA are of course also a focus for me. My very personal product highlights this year are the new Windows Mixed Reality headsets and the Surface Studio, which offers users completely new ways of working creatively.

The latter caused a stir even before the official start of IFA, as the Surface Studio was awarded the "Goldener Computer" by Computer BILD in the "Design" category. I had the honor of accepting the prize on behalf of the entire Surface team - a special honor, because the prize was awarded for the first time in this category and directly by the readers. It is now already with the colleagues in the USA. That was very important to me, because it was above all the work of chief designer Ralf Groene and his team that made this prize possible in the first place.

Last but not least, our Windows Mixed Reality platform was named one of the IFA highlights by the magazine Chip. A fine conclusion to eventful weeks that reinforce our course of continuing to develop innovative hardware and software solutions based on customer feedback. We will keep bringing products to market that make life easier or more productive, as well as products that create desire because they enable completely new experiences - for example, gaming in 4K on an Xbox One X or immersing yourself in the world of Windows Mixed Reality. These are experiences you simply want to be part of, because they make life more exciting and, in a positive sense, more intense.


A post by Thomas Kowollik
Segment Lead Consumer and Device Sales D/A/CH and member of the management board at Microsoft Germany

Tip of the Week: How to access configuration from Controller in ASP.NET Core 2.0?


You defined your configuration structure in the appsettings.json file, and you want to access this data in a Controller in your ASP.NET MVC or Web API project based on .NET Core 2.0. There are a couple of options, but one of the easiest is the following.

The example structure of your configuration in the appsettings.json file is below.

appsettings.json

[js]{
  "MySection": {
    "MyFirstConfig": "Secret string",
    "MySecondConfig": {
      "MyFirstSubConfig": true,
      "MySecondSubConfig": 32
    }
  }
}[/js]

Using the built-in support for Dependency Injection, you can inject configuration data into a Controller. Use the AddSingleton method to register a singleton service in the Startup.cs file. Just add services.AddSingleton<IConfiguration>(Configuration); in ConfigureServices.

Startup.cs

[csharp]public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    // Register the configuration root so it can be injected into controllers
    services.AddSingleton<IConfiguration>(Configuration);
}[/csharp]

In your Controller, declare an IConfiguration field and assign the configuration in the constructor. To retrieve configuration data in the Controller, use:

  • _configuration["MySection:MySecondConfig:MyFirstSubConfig"] (each configuration key separated by ":") or
  • _configuration.GetSection("MySection")["MySecondConfig:MySecondSubConfig"] (the GetSection method with your section name, followed by the remaining configuration keys separated by ":" as the indexer).

Do not forget about using Microsoft.Extensions.Configuration; at the beginning 😉

HomeController.cs

[csharp]using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;

namespace WebApplication1.Controllers
{
    public class HomeController : Controller
    {
        private IConfiguration _configuration;

        public HomeController(IConfiguration configuration)
        {
            _configuration = configuration;
        }

        public IActionResult Index()
        {
            ViewData["MySectionMyFirstConfig"] = _configuration["MySection:MyFirstConfig"];
            ViewData["MySectionMySecondConfigMyFirstSubConfig"] = _configuration["MySection:MySecondConfig:MyFirstSubConfig"];
            ViewData["MySectionMySecondConfigMySecondSubConfig"] = _configuration.GetSection("MySection")["MySecondConfig:MySecondSubConfig"];
            return View();
        }
    }
}[/csharp]

Example usage in a view:

Index.cshtml

[html]<p>
MySectionMyFirstConfig <strong>@ViewData["MySectionMyFirstConfig"]</strong><br />
MySectionMySecondConfigMyFirstSubConfig <strong>@ViewData["MySectionMySecondConfigMyFirstSubConfig"]</strong><br />
MySectionMySecondConfigMySecondSubConfig <strong>@ViewData["MySectionMySecondConfigMySecondSubConfig"]</strong>
</p>[/html]

Digital Marketing: How to Succeed in the Back-to-School Season [Published 9/9]


(This article is a translation of Digital Marketers: Cash in on Back-to-School Searches, published on the Microsoft Partner Network blog on July 24, 2017. Please see the source page for the latest information.)

In the United States, major sales take place in the run-up to the new school term, making this period the retail industry's busiest season after Christmas. According to Bing Ads research (in English), digital devices are what college students buy most at this time. As a digital marketer, you know that data-driven digital marketing campaigns are most effective during this period of increased spending. We recommend taking this opportunity to optimize Microsoft partners' search campaigns for the back-to-school audience.

Below are insights to help you build and optimize advertising campaigns that make the most of this major sales moment.

Timing is everything. Start now

We recommend acting on the following recommendations right away. When buying expensive items such as laptops, tablets, and mobile devices, consumers tend to research online before purchasing. According to Bing Ads research (in English), searches for PCs, tablets, and mobile devices surge from the end of August and peak in the last week of September.

Create pay-per-click (PPC) campaigns with enough budget allocated to handle the increase in traffic. Reviewing last season's campaigns and applying their lessons to new campaigns saves time and money. Also test character counts and images so that your search results display properly. Tailor ad content to where consumers are in the purchase process: consumers in the research phase tend to respond to longer, information-rich ad copy, while shorter copy works better for consumers who are ready to buy.

Use frequently searched terms

Using the right search terms is important. According to Microsoft research, brand names such as "HP", "Samsung", "iPad", and "Microsoft Tablets" rank high in searches for laptops and tablets. For smartphones, "iPhone" and "Samsung" are searched frequently.

Consumers want quality products at affordable prices. For laptops and tablets, the most searched non-brand terms in 2016 included the phrase "best deal", such as "best deals tablets" and "best tablet deal". For smartphones and plans, "best smartphones" and "top 10 smartphones" ranked high.

With this in mind, set keywords for both your own brand and your competitors' terms to get maximum results. Set bids while anticipating that competition will intensify toward the peak of the season. You can also make use of long-tail branded item names. For consumers who take their time researching, including reviews in your ads is an effective way to encourage a decision. Another approach is to guide consumers to product comparison tools such as the consumer Device Finder (in English) or the education Device Finder (in English) so they can choose the product that suits them best.

Know your audience

The back-to-college audience is not limited to students; parents and teachers are also targets. You therefore need to build campaigns with different target groups in mind. College freshmen, for example, tend to be influenced more by their parents' opinions than by the experience-based opinions of fellow students. Likewise, it is worth paying attention to where students live.

Also segment buyers by age and gender, and adjust bids so that your ads appear more often in your target's search results. Take full advantage of granular campaign settings to narrow your targeting based on demographic attributes such as gender and age.

Visitors who have abandoned a shopping cart are also extremely important. The shopping cart abandonment rate in North America is as high as 74%. Because these people are very likely to purchase in the future, remarketing to bring them back to your site is the best approach.

Understand how purchases are ultimately made

According to Bing Ads research (in English), 43% of digital device buyers research online and then go to a physical store to purchase. Despite the large amount of online research, 61% of all digital device sales still happen in physical stores. Only 14% complete the whole journey from research to purchase online.

Focusing only on online sales can therefore mean losing sight of the bigger picture. Target finely by demographic attributes and guide consumers to both local physical stores and your online sales sites. Synchronizing digital campaigns across multiple media maximizes their impact.

With demand for digital devices rising ahead of the new school term, now is the perfect time to use digital advertising. For more detailed insights into online search and advertising, download the Bing Ads research report (in English). It is full of tips for the back-to-school season.

Partners who use paid advertising services for their digital marketing campaigns are invited to share their thoughts in the Microsoft Partner Community here (in English).

 

 

 

 

 

 

 

IFA 2017 Talks: Greg Sullivan Explains the World of Windows Mixed Reality


IFA 2017 is not only the world's leading trade fair for consumer electronics and home appliances, but also the place to present new tech trends. In our Microsoft IFA Studio we talked with various guests about exactly these trends. The result: four hours of live talk, comprehensive information on the latest products, and countless impressions from the most important consumer trade fair in Berlin.

Mixed Reality was one of the most important topics at this year's IFA, because the Windows 10 Fall Creators Update, announced in our keynote for October 17, supports the corresponding Windows Mixed Reality headsets, which were presented at the fair by our OEM partners at the same time. As part of our IFA Talks we therefore spoke with Greg Sullivan, Director of Communications Mixed Reality at Microsoft.

He began by explaining the meaning of the various terms: Mixed Reality, Virtual Reality, and Augmented Reality. Mixed Reality covers the entire spectrum from the physical to the fully digital world. At one end of that range sits Augmented Reality, where digital elements are integrated into the physical world, for example with Microsoft HoloLens. At the other end of the spectrum is Virtual Reality, a primarily digital view of reality into which elements of the physical world are incorporated, such as hand movements via controllers. Windows Mixed Reality is the platform for the entire spectrum, so devices of every kind can be used with it and can even interoperate.

Greg Sullivan also explained the difference between Microsoft HoloLens and the newly introduced headsets from the partners Acer, ASUS, Dell, and Lenovo. The latter are connected to a Windows PC and make it possible to expand the 2D desktop into a 3D world: the PC's Windows apps can also be used with the Windows Mixed Reality headsets, games become an immersive experience, and 360-degree videos can be watched. Microsoft HoloLens, on the other hand, is the world's first and so far only self-contained holographic computer, which needs no connection to another PC to project holograms into physical reality with Windows Mixed Reality.

But who are the users of the Windows Mixed Reality headsets, and what is their future? Today the devices are used mainly for gaming and entertainment. Greg Sullivan, however, sees further possible scenarios, such as a completely new way of visualizing data. He also highlights another key factor: on the Windows platform, developers can create applications of every conceivable kind, true to the motto "The sky is the limit". The potential of the headsets is therefore virtually unlimited.

Incidentally, if Greg Sullivan had the chance to develop his very own app for Windows Mixed Reality, it would be an application for remote collaboration: via hologram you could watch concerts, sports events, and the like together with your family and talk about them without being in the same place.


A post by Sydney Loerch
PR/Communications Intern

SQL Server Database Objects and Dependencies Report


SQL Server exposes a great deal of information through Dynamic Management Views and System Catalog Views. I like the idea of self-documenting systems. Manually maintained documentation tends to become outdated very quickly and is therefore obsolete in many cases. What I've been missing is a report visualizing the database objects and dependencies at a glance. Here we go! The result may be used in addition to the self-documenting Report Environment.

The idea is to see the objects of the current database with their dependencies and where they are referenced by others.

Additionally, I'd like to control the type of dependency: all, only cross-schema, only cross-database, or only cross-server.

The result should also be neat and handy, so that it is usable directly within SQL Server Management Studio as a Custom Report with no further requirements. On top of that, it should work for a current SQL Server Reporting Services instance (2016) and for SSRS 2012.

What the result should look like

After a parameter pane, an overview of the dependencies is shown, followed by the detailed objects and dependencies:

 

Report Design

Basic Design

The basic design for the report consists of:

  • Headline
  • Parameter Pane, showing the three report parameters in the report result, and usable as toggle-items to refresh the report with a different parameter set
  • Dependencies Overview, a matrix of the result, filtered for rows containing dependencies
  • Details, starting with Column-Toggle-Items (Table bound to Dummy-Dataset) and the actual Detail-Table

Design View:

Parameter Pane

The parameter pane is based on the Dummy-Dataset, rendering all rows as header-rows.

Cells containing the currently selected parameter value are not rendered as links; this is controlled by a dynamic report action expression, e.g. for the "all" cell of ShowDependencies:

=IIf(Parameters!ShowDependencies.Value = "all", Nothing, Parameters!ReportName.Value)

The background color for cells containing the currently selected parameter is set to grey, e.g. cells for ShowDependencies:

=IIf(Parameters!ShowDependencies.Label = me.Value, "LightGrey", "White")

Overview

The overview is a matrix for all rows containing references, using the following filter condition:

<Filters>
	<Filter>
	  <FilterExpression>=Fields!ReferencedObjectName.Value</FilterExpression>
	  <Operator>NotEqual</Operator>
	  <FilterValues>
		<FilterValue>=Nothing</FilterValue>
	  </FilterValues>
	</Filter>
</Filters>

Details

The details are listed as grouped table containing object details and further columns for References and Referenced By:

  • Column visibility can be toggled with the text boxes of a separate table
  • The 2nd header row of the main table also contains a sub-table for each column with the corresponding row headers
  • Bookmark actions for objects mentioned as reference or referenced-by enable direct navigation within the report

Toggle-Items:

  • Initially, the rows for all non-dbo schemas of the current database are shown; all other rows are collapsed by default.
  • The reference-column is visible by default.
  • The referencedBy-column isn't visible by default.

Report Data Elements

The report is based on a single Data Source and some Report Parameters and Datasets.

Overview

Report Parameter

  1. ShowDependencies, filters dependencies: all / Cross-Schema / Cross-Database / Cross-Server
  2. FilterReferencing: all / with / without
  3. FilterReferencedBy: all / with / without
  4. ReportName: hidden, used for report actions, executing the current report with a different parameter set
    When executed as an SSMS Custom Report, the value for Globals!ReportName is not set. Therefore, this parameter is used as a single point of calculation for all report actions:
=IIf(Globals!ReportName = "", "SSRS-DatabaseObjectsAndDependencies", Globals!ReportName)

Getting the data

I'd like to look at several aspects to gather the information for this report. The overall query is too big to be a reasonable part of this post - take a look at the report definition to get the full query.

Getting the details for foreign keys

The System Catalog View sys.foreign_keys contains the foreign keys of the current database:

SELECT
	-- source
	@@SERVERNAME AS SourceServerName,
	DB_NAME() AS SourceDatabaseName,
	OBJECT_SCHEMA_NAME(src.object_id) AS SourceSchemaName,
	src.[type] AS SourceObjectType,
	src.type_desc AS SourceObjectTypeDesc,
	src.name AS SourceObjectName,

	-- destination
	@@SERVERNAME AS DestinationServerName,
	DB_NAME() AS DestinationDatabaseName,
	OBJECT_SCHEMA_NAME(dest.object_id) AS DestinationSchemaName,
	dest.[type] AS DestinationObjectType,
	dest.type_desc AS DestinationObjectTypeDesc,
	dest.name AS DestinationObjectName
FROM sys.foreign_keys AS k
-- Foreign Keys are only within same database
LEFT JOIN sys.objects AS src ON src.object_id = k.parent_object_id
LEFT JOIN sys.objects AS dest ON dest.object_id = k.referenced_object_id
-- Hide self-references
WHERE k.parent_object_id <> k.referenced_object_id

Getting the details for SQL references

SQL Server exposes dependency information, e.g. for stored procedures or views, in the System Catalog View sys.sql_expression_dependencies. It contains references within the same database, but also references to other databases or even to remote objects. It may also contain references to obsolete objects which no longer exist.

SELECT
	-- source
	@@SERVERNAME AS SourceServerName,
	DB_NAME() AS SourceDatabaseName,
	OBJECT_SCHEMA_NAME(src.object_id) AS SourceSchemaName,
	src.[type] AS SourceObjectType,
	src.type_desc AS SourceObjectTypeDesc,
	src.name AS SourceObjectName,

	-- destination
	ISNULL(r.referenced_server_name, @@SERVERNAME) AS DestinationServerName,
	ISNULL(r.referenced_database_name, DB_NAME()) AS DestinationDatabaseName,
	ISNULL(r.referenced_schema_name, OBJECT_SCHEMA_NAME(r.referenced_id)) AS DestinationSchemaName,
	dest.[type] AS DestinationObjectType,
	dest.type_desc AS DestinationObjectTypeDesc,
	r.referenced_entity_name AS DestinationObjectName
FROM sys.objects AS src
INNER JOIN sys.sql_expression_dependencies AS r ON r.referencing_id = src.object_id
LEFT JOIN sys.objects AS dest ON dest.object_id = r.referenced_id
WHERE
	-- Hide references to temporary objects within Stored Procedure
	NOT (src.type_desc = 'SQL_STORED_PROCEDURE' AND r.referenced_id IS NULL AND r.referenced_schema_name IS NULL)
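
As a quick way to experiment with this dependency data outside the report, here is a minimal PowerShell sketch of a trimmed-down version of the same query, assuming the SqlServer module and using placeholder server and database names:

# Minimal sketch (not part of the report): run a trimmed-down dependency query from PowerShell.
# Server and database names are placeholders.
Import-Module SqlServer

$query = @'
SELECT
    OBJECT_SCHEMA_NAME(src.object_id) AS SourceSchemaName,
    src.name                          AS SourceObjectName,
    ISNULL(r.referenced_schema_name, OBJECT_SCHEMA_NAME(r.referenced_id)) AS DestinationSchemaName,
    r.referenced_entity_name          AS DestinationObjectName
FROM sys.objects AS src
INNER JOIN sys.sql_expression_dependencies AS r ON r.referencing_id = src.object_id;
'@

Invoke-Sqlcmd -ServerInstance "localhost" -Database "MyDatabase" -Query $query | Format-Table -AutoSize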

Getting the details for Objects

The objects of the current database are exposed in the System Catalog View sys.objects. The common name of table types may differ, and therefore the one from sys.table_types is used.

SELECT
	@@SERVERNAME AS ServerName,
	DB_NAME() AS DatabaseName,
	OBJECT_SCHEMA_NAME(src.object_id) AS SchemaName,
	src.[type] AS ObjectType,
	src.type_desc AS ObjectTypeDesc,
	ISNULL(srcTableType.name, src.name) AS ObjectName
FROM sys.objects AS src
-- in case of Table Types, name in sys.objects differs from actual name in sys.table_types
LEFT JOIN sys.table_types AS srcTableType ON srcTableType.type_table_object_id = src.object_id

Overall process

Steps to get the overall result:

  1. The data of all known dependencies in the current database for foreign keys and SQL references are gathered.
  2. The filtered result using parameter @ShowDependencies (all / Cross-Schema / Cross-Database / Cross-Server) is stored in a temporary table #AllReferences.
  3. The object details for all objects of the current database are stored in a temporary table #AllObjects.
  4. Objects only mentioned in dependencies, based on #AllReferences, are added to #AllObjects.
  5. Objects of some types, such as constraints, are not listed in the result themselves.
    In case of constraints, they're in the result as dependency information for their tables without mentioning the name of the actual constraint: no "FK_xyz_abc" in the result, but a reference of table xyz to table abc and vice versa as referenced-by.
  6. Building up the overall result, combining the data of #AllObjects and #AllReferences to a single table.

Columns of the overall result, including the object itself, its dependencies as Referenced..., and dependent objects as ReferencedBy...:

  • ServerName
  • DatabaseName
  • SchemaName
  • ObjectType
  • ObjectTypeDesc
  • ObjectName
  • ReferencedServerName
  • ReferencedDatabaseName
  • ReferencedSchemaName
  • ReferencedObjectType
  • ReferencedObjectTypeDesc
  • ReferencedObjectName
  • ReferencedByServerName
  • ReferencedByDatabaseName
  • ReferencedBySchemaName
  • ReferencedByObjectType
  • ReferencedByObjectTypeDesc
  • ReferencedByObjectName

Constraints

Play by the rules: Reports for SQL Server Management Studio (SSMS)

In order to make the report executable directly in SSMS as well, it may not reference Shared Data Sources, Shared Datasets, or Subreports, and parameter handling is also constrained. Therefore, the report does all the work itself, referencing nothing but a database whose objects and dependencies are analyzed. The parameter handling is likewise done in the result, using a custom parameter pane and report actions that execute the current report with a different parameter set.

Differences between Report Versions

There are only very few differences between the Report Definitions for SSRS 2008R2 (including 2012 / 2014) and SSRS 2016:

  1. Schema of RDL-File
    • 2008R2: http://schemas.microsoft.com/sqlserver/reporting/2010/01/reportdefinition
    • 2016: http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition
  2. ReportParametersLayout
    With SSRS 2016, the layout of the parameters is customizable. Within the RDL-File, this is stored as ReportParametersLayout-Element.

The SSRS 2016 Report may also work with newer versions - as long as the schema doesn't change.

The Query itself is based on System Catalog Views, available in SQL Server >= 2008 and Azure SQL Database.

Conclusion

The result became very complex, but in the end, it fulfills the purpose. Knowing the parts and how they fit together, the overall picture should be complex, but no longer complicated. Using this report, you should be able to easily analyze and document database dependencies.

There may be further extensions to this report, not only listing the objects and dependencies of the current database but of all databases of a server - e.g. using the very famous but also very shy and therefore officially undocumented stored procedure sp_MSforeachdb. Doing so may break the interoperability with Azure SQL Database. But there is also a way to work around this, resulting in a compatible version.
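
As a hypothetical sketch of such a workaround (not part of the published report), the per-database loop can be driven from PowerShell instead of sp_MSforeachdb, keeping the T-SQL itself unchanged; the instance name and the simplified query are placeholders:

# Hypothetical sketch: loop over all online databases from the outside instead of using sp_MSforeachdb
Import-Module SqlServer

$instance = "localhost"
$query    = "SELECT DB_NAME() AS DatabaseName, COUNT(*) AS SqlReferences FROM sys.sql_expression_dependencies;"

$databases = Invoke-Sqlcmd -ServerInstance $instance -Query "SELECT name FROM sys.databases WHERE state = 0;"

foreach ($db in $databases) {
    # $query stands in for the report's full dependency query
    Invoke-Sqlcmd -ServerInstance $instance -Database $db.name -Query $query
}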

Do you feel challenged? It's up to you! Start going beyond and please share your insights! 🙂

Download

Further Reading

Top Contributors Awards! September’2017 Week 1


Welcome back for another analysis of contributions to TechNet Wiki over the last week.

First up, the weekly leader board snapshot...

 

As always, here are the results of another weekly crawl over the updated articles feed.

 

Ninja Award Most Revisions Award
Who has made the most individual revisions

 

#1 M.Vignesh with 128 revisions.

 

#2 Ken Cenerelli with 119 revisions.

 

#3 Kapil.Kumawat with 72 revisions.

 

Just behind the winners but also worth a mention are:

 

#4 Arleta Wanat with 45 revisions.

 

#5 Sabah Shariq with 39 revisions.

 

#6 Nourdine MHOUMADI with 33 revisions.

 

#7 Maruthachalam with 29 revisions.

 

#8 RajeeshMenoth with 22 revisions.

 

#9 ??????? with 20 revisions.

 

#10 Mandar Dharmadhikari with 13 revisions.

 

 

Ninja Award Most Articles Updated Award
Who has updated the most articles

 

#1 M.Vignesh with 78 articles.

 

#2 Ken Cenerelli with 61 articles.

 

#3 Arleta Wanat with 25 articles.

 

Just behind the winners but also worth a mention are:

 

#4 Kapil.Kumawat with 16 articles.

 

#5 RajeeshMenoth with 13 articles.

 

#6 Richard Mueller with 11 articles.

 

#7 ??????? with 10 articles.

 

#8 Maruthachalam with 6 articles.

 

#9 Peter Geelen with 3 articles.

 

#10 Deva [MSFT] with 3 articles.

 

 

Ninja Award Most Updated Article Award
Largest amount of updated content in a single article

 

The article to have the most change this week was DNS e seus mundos de Troubleshooting - Windows Server 2008/2012/2016 - Parte4 (Teoria e Prática), by FÁBIOFOL

This week's revisers were Kapil.Kumawat & M.Vignesh

If you want to know about DNS Zone Configuration, read this article - very important and useful.

 

Ninja Award Longest Article Award
Biggest article updated this week

 

This week's largest document to get some attention is Beginners Guide to implement AJAX CRUD Operations using JQuery DataTables in ASP.NET MVC 5, by Ehsan Sajjad

This week's revisers were RajeeshMenoth, ???????, M.Vignesh, Kapil.Kumawat & Carsten Siemens

Wow, what an article, how to implement AJAX CRUD operations for a particular entity using JQuery DataTables in ASP.NET MVC 5. Nice article Ehsan 🙂

 

Ninja Award Most Revised Article Award
Article with the most revisions in a week

 

This week's most fiddled with article is Licenciamento Windows Server X VMware, by Marcos Roberto de Lima. It was revised 33 times last week.

This week's revisers were M.Vignesh, Marcos Roberto de Lima & Waqas Sarwar(MVP)

Know about Licensing Windows Server X VMware. Informative 🙂

 

Ninja Award Most Popular Article Award
Collaboration is the name of the game!

 

The article to be updated by the most people this week is TechNet Guru Competitions - September 2017, by Peter Geelen

Gurus where are you?? 10 days and 9 articles so far till date in all categories. Slow Start 🙁 Go Go Gurus!!

This week's revisers were C Sharp Conner, SYEDSHANU - MVP, Kapil.Kumawat, Mandar Dharmadhikari, Rahul_Dagar, Abhishek0127[Abhishek kumar], Mustafa Toroman & Bhushan Gawale

 

The article to be updated by the most people this week is SharePoint 2016 - Legacy Site Detected, by Rahul_Dagar

If you are getting error 'Legacy Site Detected' in SharePoint 2016, check this article. Solution provided by Rahul. Good work 🙂

This week's revisers were M.Vignesh, Sabah Shariq, Ken Cenerelli, Kapil.Kumawat, Waqas Sarwar(MVP) & Rahul_Dagar

 

Ninja Award Ninja Edit Award
A ninja needs lightning fast reactions!

 

Below is a list of this week's fastest ninja edits. That's an edit to an article shortly after another person's edit.

 

Ninja Award Winner Summary
Let's celebrate our winners!

 

Below are a few statistics on this week's award winners.

Most Revisions Award Winner
The reviser is the winner of this category.

M.Vignesh

M.Vignesh has won 16 previous Top Contributor Awards. Most recent five shown below:

M.Vignesh has not yet had any interviews, featured articles or TechNet Guru medals (see below)

M.Vignesh's profile page

Most Articles Award Winner
The reviser is the winner of this category.

M.Vignesh

M.Vignesh is mentioned above.

Most Updated Article Award Winner
The author is the winner, as it is their article that has had the changes.

FÁBIOFOL

FABIOFOL has won 2 previous Top Contributor Awards:

FABIOFOL has not yet had any interviews, featured articles or TechNet Guru medals (see below)

FABIOFOL's profile page

Longest Article Award Winner
The author is the winner, as it is their article that is so long!

Ehsan Sajjad

Ehsan Sajjad has won 3 previous Top Contributor Awards:

Ehsan Sajjad has TechNet Guru medals, for the following articles:

Ehsan Sajjad has not yet had any interviews or featured articles (see below)

Ehsan Sajjad's profile page

Most Revised Article Winner
The author is the winner, as it is their article that has been changed the most.

Marcos Roberto de Lima

This is the first Top Contributors award for Marcos Roberto de Lima on TechNet Wiki! Congratulations Marcos Roberto de Lima!

Marcos Roberto de Lima has not yet had any interviews, featured articles or TechNet Guru medals (see below)

Marcos Roberto de Lima's profile page

Most Popular Article Winner
The author is the winner, as it is their article that has had the most attention.

Peter Geelen

Peter Geelen has been interviewed on TechNet Wiki!

Peter Geelen has featured articles on TechNet Wiki!

Peter Geelen has won 185 previous Top Contributor Awards. Most recent five shown below:

Peter Geelen has TechNet Guru medals, for the following articles:

Peter Geelen's profile page

 

Rahul_Dagar

This is the first Top Contributors award for Rahul_Dagar on TechNet Wiki! Congratulations Rahul_Dagar!

Rahul_Dagar has not yet had any interviews, featured articles or TechNet Guru medals (see below)

Rahul_Dagar's profile page

Ninja Edit Award Winner
The author is the reviser, for it is their hand that is quickest!

Abhishek0127[Abhishek kumar]

Abhishek0127[Abhishek kumar] has been interviewed on TechNet Wiki!

Abhishek0127[Abhishek kumar] has won 3 previous Top Contributor Awards:

Abhishek0127[Abhishek kumar] has TechNet Guru medals, for the following articles:

Abhishek0127[Abhishek kumar] has not yet had any featured articles (see below)

Abhishek0127[Abhishek kumar]'s profile page

 

Another great week from all in our community! Thank you all for so much great literature for us to read this week!
Please keep reading and contributing!

 

Best regards,
— Ninja [Kamlesh Kumar]

 

Partners Doing Good: Capstone IT's Initiatives [Published 9/10]


(This article is a translation of Partners Doing Good: Capstone IT, published on the Microsoft Partner Network blog on July 25, 2017. Please see the source page for the latest information.)

Community ties are the foundation of Microsoft partners' charitable work. At Microsoft we see many examples of companies raising their profile and building local success by giving back to and supporting their communities. This time we introduce how Microsoft partner Capstone IT (in English) contributes to its hometown of Rochester, New York.

It starts with the community

Sitima Fowler, co-CEO of Capstone IT, sees charitable work as both a pleasure and an obligation. The company is a managed IT service provider for Rochester's SMB market and now also operates in West Palm Beach, Florida. It offers simple, affordable service packages that bundle Office 365 migration, IT support, and security management. Rochester has the second-highest poverty rate of any city in the United States, and Sitima and her husband and co-CEO Mike Fowler had long felt the need to support the community.

"Serving local small and midsize businesses, we are closely tied to this small market. We owe our steady business growth to the community. This area, where our employees live, is our home. We wanted to give back to the community that has supported us."

– Sitima Fowler, co-CEO, Capstone IT

Sitima says that passion for the region is what got the activities started. Capstone IT supports several nonprofits and takes part in annual giving programs. While Sitima and Mike were being inspired by the charitable work around them, someone they respect encouraged them to get involved. Sitima says, "We care about our hometown, because we are a company deeply rooted here and we sincerely want this region to thrive and prosper."

Getting involved

Capstone IT supports a variety of local groups and programs, providing opportunities and supplies to local youth and the homeless. The two CEOs are directly involved with nonprofits as board members, volunteering and raising funds. Particularly interesting is the "Adopt-A-Classroom" program, which invites students from nearby schools to the office to learn about tools and technology that will help them in the future. Students get hands-on time with software such as Skype for Business and PowerPoint and watch engineers take a computer apart.

As an activity employees could enjoy, Capstone IT once ran a ten-month giving campaign to mark its tenth anniversary. The campaign was so well received that it was extended to a full year. Employees also receive eight hours of volunteer leave per year to take part in the charity events of their choice. Beyond this, the company takes part in a range of other programs and activities.

Bivona Child Advocacy Center

Sitima says she was nominated for the board of the Bivona Child Advocacy Center (in English) but held off for a while. Bivona helps abused children rebuild their lives, and as the mother of a teenage daughter she felt the role might be too heavy a burden. Through collaborative support services, awareness education, and proactive work in the community, Bivona streamlines the process of supporting and protecting children living with the trauma of abuse.

Taking part in Bivona's work hit close to home for her as a mother, but as she engaged with the organization again and again, Sitima came to feel clearly that this was her calling. She has served on the board since May 2014.

East House

Mike supports a nonprofit called East House (in English). Its mission is to help people living with chronic mental illness or substance addiction lead independent, fulfilling lives, and it encourages recovery through services such as supported housing, treatment coordination, education programs, and employment programs. It also provides the opportunities and tools needed to return to society. Mike has served on its board since 2013.

RAIHN

Sitima is especially devoted to the work of the Rochester Area Interfaith Hospitality Network (RAIHN, in English), which provides nutritious meals and personal care. RAIHN is a nonprofit, attentive to religious and cultural backgrounds, that provides shelter, meals, and medical care to help homeless families achieve lasting independence. When Sitima's family hosts an event, they give it everything they have: to convey compassion and love to people in need, they prepare a feast and do their utmost to welcome guests with music and candlelight. Sitima says she wants to share not just a meal but a sense of security and happiness.

"Even if it is only for one night, we hope to lighten the hardship and burden of people who have no home."

– Sitima Fowler, co-CEO, Capstone IT

The importance of giving back

Sitima feels that Capstone IT's charitable work matters both to her personally and to the business, because Capstone IT's customers are people in the Rochester community and their families. The community is also the foundation on which she, her employees, and their families build good lives. That is exactly why contributing to it is so important.

Rochester's economy is not in good shape. Poverty and unemployment rates are high, once-thriving industries have declined, and local companies can no longer give back to the community as much as they used to. Major companies such as Kodak and Xerox were once pillars of the Rochester economy but have since scaled back. Sitima believes that successful companies have a responsibility to give back: "We can't end hunger around the world, but we want to do our part as much as we can."

The still-growing Capstone IT has been certified as a "Great Place to Work." Building charitable work into the company's philosophy has also improved employee retention. Sitima and Mike are now working hard to live up to the company's values. Their long-term goal is to keep up their current charitable work and nonprofit support while growing the business so they can expand their giving. Sitima dreams of launching an entrepreneurship program in rural India, hoping one day to use her business experience to pass that knowledge on to people who have not had equal opportunities.

To discuss giving back to the community, please join the Microsoft Partner Community (in English) and exchange ideas with other partners.

 

 

 

 


Ignite 2017 – Windows 10 Deployment And Management Sessions


It’s almost that time of year: Microsoft’s Ignite event takes place in September, and here are a few of the sessions that focus on Windows 10, to give you an idea of which ones you might want to consider attending or, more likely for the majority, watch online after the fact. As these posts continue there will most likely be overlap between technologies, so I will try to avoid posting the same recommendation twice.

Windows 10 servicing explained (WAAS): Deploying Windows as an inplace upgrade

This session helps you understand the why and how of implementing Windows as a service. Learn how to secure your deployment investment by deploying Windows as an in-place upgrade instead of wipe-and-load builds. Adnan clears the confusion between ideology and terminology.

The core value of Windows 10 apps in the enterprise

Traditional Win32 applications present numerous challenges and complications for enterprises. This session dives into how your company can benefit from Windows 10 packaged apps and the Universal Windows Platform. Packaged apps are the foundation of modern app management in Windows 10. We discuss the key design elements of a packaged app and how it provides value to enterprises. We show how packaged apps improve app deployment and manageability in an enterprise. We also provide a roadmap for how you can migrate the line-of-business apps in your enterprise to package apps.

Servicing Windows 10 in the real world

After over two years of deploying and managing Windows 10, Johan Arwidmark and Mikael Nystrom have a rich set of tips and tricks. The background? Once Windows 10 is deployed, it needs your attention. You need to maintain it, a process commonly named as servicing. In this session, learn how to deploy feature upgrades to existing Windows 10 machines. There are many tools with which IT pros can service Windows as a service.

Windows Insider Program 101

This session provides an intro to the Windows Insider Program (WIP) including WIP for Business. It is a prerequisite to Dona Sarkar's session

How Microsoft IT deploys Windows 10 and implements Windows as a service

Learn how Microsoft IT adopted and deployed Windows 10 internally using Enterprise Upgrade as the primary deployment method. This approach reduced the deployment overhead by using System Center Configuration Manager Operating System Deployment (OSD) and upgrade which resulted in significant reductions in helpdesk calls. In addition we share how we are leveraging some of the new enterprise scenarios to delight users while securing the enterprise. You can realize similar benefits in your enterprise by adopting these best practices.

Secure Windows 10 with Intune, Azure AD and System Center Configuration Manager

Microsoft offers a deep bench of security technologies, but how can they be deployed and configured in the real world? In this session we show you how Intune, Azure Active Directory, and System Center Configuration Manager can be used to configure Windows 10 devices for maximum security.

Mastering the lions PAW: How to build a privileged access workstation

In this session, learn how to build a secure way of managing an enterprise. Since 2000 you were not supposed to log on to servers with RDP, but most people still do. This is not the way to manage Windows! This session offers you three different ways/levels of managing servers from what’s known as a “Privileged Access Workstation.” Believe Jeffrey Snover when he says you should not manage your servers one by one with RDP, but with RSAT and PowerShell.

Ask the Experts: Windows 10 deployment and servicing

Got questions about Windows 10 deployment and servicing topics? Bring your toughest questions to fire at a panel of experts. Even if you don't have any questions yourself, come learn from the conversations.

Servicing Windows 10: Understanding the Windows as a service process and improvements

Windows 10 makes significant changes to the way Windows is deployed and kept up to date; this new process is called "Windows as a service." In this session, we explore what this means, including concepts, terminology, and processes. We also review recent improvements that have been made, and look at the roadmap forward.

Deploying Windows 10 in the enterprise using traditional and modern techniques

Windows 10 opens up new deployment options, while continuing to support traditional image-based techniques. In this pre-day training session, we dive into both, so that you understand the changes that Windows 10 introduces for traditional OS deployment (including building, customizing, and deploying images with MDT and System Center Configuration Manager), as well as new modern deployment options available with Azure Active Directory and Mobile Device Management (MDM) tools, provisioning packages, and more. We also talk about Windows Analytics.

Windows Insider Business Program: A perfect match for companies (repeat)

Welcome to the Windows Insider for Business Program (WIB) best practice session. The speaker talks about the program and its use in different companies. He talks about the added values for companies and the cooperation with the Windows Insider Team. Use the advantage of WIB to be a hero of your company.

Reducing the network impact of Windows 10 feature and quality updates

A common concern with Microsoft Windows 10 and Windows as a service has to do with the size of both monthly quality updates and the less-frequent feature updates. We have a variety of technologies that can help reduce the impact on an organization's network and core infrastructure. In this session, we review those technologies and show how to use them to minimize the overall impact.

Expert-level Windows 10 deployment

In the session we are taking OS Deployment in Microsoft Deployment Toolkit and System Center Configuration Manager to its outer limits. Deployment tips, tricks, and hardcore debugging in a single session from industry experts. You can expect a lot of live demos in this session.

Learn how to service Windows 10 using Windows Update for Business

Learn the best practices and guidance on how to service Windows 10 leveraging Windows Update for Business platform. In this session we will cover Microsoft guidance on deployment rings, servicing control offered by Windows Update for Business, brief introduction to Delivery Optimization (peer-peer caching) technology used by Windows Update for Business, brief introduction to Windows Analytics Update Compliance reporting used by Windows Update for Business and integration of Windows Update for Business with management solutions.

Servicing Windows 10 in the real world

After over two years of deploying and managing Windows 10, Johan Arwidmark and Mikael Nystrom have a rich set of tips and tricks. The background? Once Windows 10 is deployed, it needs your attention. You need to maintain it, a process commonly named as servicing. In this session, learn how to deploy feature upgrades to existing Windows 10 machines. There are many tools with which IT pros can service Windows as a service.

Top ten reasons to use Windows 10 Current Branch vs. Long Term Servicing Branch

For many organizations, moving to the Windows 10 Current Branch twice-a-year release cycle may require significant change across their entire IT organization's processes, tools, and people (e.g., IT support, client team, application owners, security team, etc.). Often, the Long Term Servicing Branch (LTSB) is seen as a possible nirvana for all of the questions that Windows 10 Current Branch raises. In this session, we discuss how LTSB has a valid role in many organizations, but clarify why the Current Branch should be used by the majority of customers.

Ignite 2017 – Modern IT with Windows 10 Sessions


It’s almost that time of year: Microsoft’s Ignite event takes place in September, and here are a few of the sessions that focus on Windows 10, to give you an idea of which ones you might want to consider attending or, more likely for the majority, watch online after the fact. As these posts continue there will most likely be overlap between technologies, so I will try to avoid posting the same recommendation twice. That first session looked so good it's getting repeated!

Windows 10 management with Microsoft 365 Business (Repeated)

In this session, learn what Microsoft 365 Business brings to Windows users and devices, getting a deeper understanding of how Azure Active Directory and Microsoft Intune work together to enable upgrade and management scenarios.

Make Windows devices more secure by taking them out of your existing infrastructure

Can an Internet-connected device joined to Azure Active Directory really be safer for a company than a traditional domain-joined PC? Come and find out how in this session. Learn how to get the most out of an Enterprise Mobility + Security (EMS) subscription. Learn also how to maximize device and data security with Windows 10 devices, bringing into the mix EMS and Office 365 for a ‘better together’ strategy.

Why WCD is wicked for modern deployment

The Windows Configuration Designer (WCD) tool allows you to transform off-the-shelf devices rapidly, helping you to bring agility and flexibility to your deployment process.

Windows 10 and the cloud: Why the future needs hybrid solutions

Cloud services have become firmly established in the working day of many companies. Almost everywhere, initiatives or projects are in progress that deal with the workplace of the future. Windows 10, Intune and Azure Active Directory open up new opportunities for cloud-based management, authentication, and administration. Scenarios such as BYOD and COPE let companies think about how users access business resources and apps.

Architect a modern and secure desktop for your organization

Windows 10 and Office 365 ProPlus are optimized for the cloud and deliver value to enterprise organizations beyond any competing client. Learn how to architect end-to-end solutions for deployment, protection, and change management of your modern desktop, including Microsoft Enterprise Mobility + Security (EMS). Learn about the new update model for client updates, including types of releases and cadence for both Windows 10 and Office 365 ProPlus.

Transition to cloud-based management of Windows 10 and Office 365 ProPlus with EMS

Are you looking to transition the management of Windows 10 and Office 365 ProPlus to the new agentless approach, but don’t know where to start? Join this session to see how you can do this at your own pace with System Center Configuration Manager, Microsoft Intune, Azure Active Directory, and other cloud services.

New modern management features for IT pros

Windows 10 is designed for Modern IT. In this session, we talk about the different ways you can modernize IT management to get the most out of Windows 10 devices. This session is presented to you by the Windows modern management team alongside Jeremy Moskowitz, 15-year Group Policy and MDM MVP. The session covers the paths we see organizations adopt to move to modern management, when to use modern management, challenges we see, and how we recommend addressing them.

Manage Windows devices in the complex hybrid cloud world of today

IT departments today have more challenges than ever when it comes to securing and managing devices, especially when so many are mobile. Check out your options as we evaluate on-premises and cloud-based Microsoft tools to see which works best for your business.

Deploying Windows 10: An overview of what's new and future direction

The process for deploying Windows 10 continues to evolve. In this session, we look at recent improvements that have been introduced, as well as the future direction with new modern deployment techniques.

Deploying Windows 10: User-driven cloud deployment with Windows AutoPilot

Windows 10 introduces new modern deployment capabilities, called Windows AutoPilot, that leverage the cloud to automate the configuration of new PCs without any need for IT to even touch the computer. In this session, learn about Windows AutoPilot and the key cloud-based scenarios that it enables.

Modernize deployment & servicing of Windows 10 & Office ProPlus with Enterprise Mobility + Security

Did you know that you can now replace imaging with a new approach with Windows AutoPilot, Azure Active Directory, and Intune? How about the ability to service Windows 10 and Office 365 ProPlus from the cloud? Join this session to see how you can simplify deployment and servicing in your organization, making your life easier and also lowering TCO.

Group Policy in MDM: Dealing with ADMX backed policies

Windows 10 introduces new management features from MDM. ADMX backed policies introduce Group Policy management. Learn what these policies can do for your systems and how to implement Group Policy from Intune.

Windows devices in Azure Active Directory: Why should I care?

Why should you care about bringing your devices to Azure AD? How about giving your users great productivity experiences while keeping your organizational resources fully protected? Users in Windows 10 will enjoy single sign-on, consistent settings across devices, and biometric sign-in to Windows and org resources with Windows Hello for Business, to name a few. Benefits like Azure AD device- and app-based conditional access and Azure AD Identity Protection will give you the peace of mind you need while enabling productivity in a mobile world. New modern management experiences will enrich your IT experience with devices. Come and learn how to excel as you deploy Windows 10 and manage device identities in your organization.

Use MDM Migration Analysis Tool to accelerate move from group policy to MDM

Do you currently manage your Windows PCs using Group policy, and are you looking to use Mobile Device Management to simplify this? Are you wondering where to start assessing the policies you have configured and which of them are available in MDM today? If so, this session is your dream come true. Come and join us to understand how you can use MDM Migration Analysis Tool (MMAT) to understand what policies you have configured in your environment today and see the extent of MDM availability for these.

Office 365: Users have both a cloud and on premises mailbox.


In Office 365 our provisioning logic generally prevents the existence of a mailbox both on premises and in the service.  You can imagine the confusion of cloud messages delivering to a mailbox in the cloud and on premises messages delivering to a mailbox on premises.

 

When a mailbox is provisioned on premises, the Azure Active Directory synchronization process is responsible for synchronizing the attributes into Azure AD and Office 365.  One of the attributes synchronized from on premises to Azure AD is the ExchangeGUID.  Here is an example of a test account on premises.

 

[PS] D:>Get-Recipient RecipientTest | fl name,recipienttype,exchangeGuid

Name          : Recipient Test
RecipientType : UserMailbox
ExchangeGuid  : f20047a9-3fd1-4906-8d98-188feacdd5b7

 

When AAD Connect has completed a synchronization cycle the same can be verified on the mail user object created in the service.  Any on premises mailbox should be represented by a mail user object within the service.

 

PS C:> Get-Recipient RecipientTest | fl name,recipienttype,exchangeguid,skuAssigned

Name          : Recipient Test
RecipientType : MailUser
ExchangeGuid  : f20047a9-3fd1-4906-8d98-188feacdd5b7

SKUAssigned   :

 

In most cases the mail user object is initially unlicensed.  The presence of the immutableID value demonstrates a link to an on premises Active Directory account.

 

PS C:\> Get-MsolUser -UserPrincipalName recipienttest@domain.com | fl displayName,userprincipalname,immutableid,isLicensed,Licenses

DisplayName       : Recipient Test
UserPrincipalName : RecipientTest@domain.com
ImmutableId       : XSLPV65jKEqTyWafMqX8LA==
IsLicensed        : False
Licenses          : {}

 

In this example we will assign an Exchange Online license via the portal.

 

[Screenshot: assigning an Exchange Online license to the user in the Office 365 admin portal]
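
If you prefer to script the license assignment rather than use the portal, here is a minimal sketch using the MSOnline module. The UPN and AccountSkuId below are simply the values from the sample output in this post; list your tenant's SKUs with Get-MsolAccountSku and substitute your own.

Connect-MsolService
# A usage location must be set before a license can be assigned
Set-MsolUser -UserPrincipalName recipienttest@domain.com -UsageLocation US
# Assign the SKU that contains the Exchange Online option
Set-MsolUserLicense -UserPrincipalName recipienttest@domain.com -AddLicenses "ORGANIZATION:STANDARDWOFFPACK"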

 

The license assignment can be validated with get-MSOLUser.

 

PS C:\> Get-MsolUser -UserPrincipalName recipienttest@DOMAIN.com | fl displayName,userprincipalname,immutableid,isLicensed,Licenses

 

DisplayName       : Recipient Test
UserPrincipalName : RecipientTest@domain.com
ImmutableId       : XSLPV65jKEqTyWafMqX8LA==
IsLicensed        : True
Licenses          : {ORGANIZATION:STANDARDWOFFPACK}

 

The replication of the license status can be validated with Get-Recipient by reviewing the skuAssigned property.  In this case the object remains a mail user with SKUAssigned set.

 

PS C:> Get-Recipient RecipientTest | fl name,recipienttype,exchangeguid,skuAssigned

Name          : Recipient Test
RecipientType : MailUser
ExchangeGuid  : f20047a9-3fd1-4906-8d98-188feacdd5b7
SKUAssigned   : True

 

The presence of the ExchangeGUID and license has signified to the provisioning process that a mailbox object should not be provisioned.  This test has concluded as expected.

 

If the presence of an ExchangeGUID + a license signifies that no mailbox should be provisioned how is it possible that a mailbox could be provisioned in both locations?  Let us take a look at an example presented by a customer I recently worked with…

 

In this instance I have created a CLOUD ONLY account.  The account has been provisioned with a UPN that matches an on premises UPN suffix and no licenses.

 

PS C:> Get-MsolUser -UserPrincipalName testDuplicate@DOMAIN.com | fl displayName,userPrincipalName,isLicensed,Licenses

DisplayName       : Test Duplicate
UserPrincipalName : TestDuplicate@DOMAIN.com
IsLicensed        : False
Licenses          : {}

 

When a license that contains an Exchange Online option is assigned via the portal, a mailbox object is provisioned in Exchange Online.  This is the expected behavior – no ExchangeGUID is replicated from on premises.  In addition, the immutableID is not populated, demonstrating there is no link to an on premises AD account (therefore a cloud only account).

 

PS C:> Get-MsolUser -UserPrincipalName testDuplicate@domain.com | fl displayName,userPrincipalName,immutableID,isLicensed,Licenses

DisplayName       : Test Duplicate
UserPrincipalName : TestDuplicate@domain.com
ImmutableId       :
IsLicensed        : True
Licenses          : {ORGANIZATION:STANDARDWOFFPACK}

 

PS C:> Get-Recipient TestDuplicate | fl name,recipienttype,exchangeguid,skuAssigned

Name          : TestDuplicate
RecipientType : UserMailbox
ExchangeGuid  : 9e58304a-ed80-416a-8eac-f2e769056e52
SKUAssigned   : True

 

At this point everything looks to be working as expected.  Here is where the issue could come into existence.  The customer I worked with was testing Office 365.  In this instance they intentionally mirrored some key on premises accounts prior to enabling directory synchronization in their tenant.  That is to say, the customer created cloud only test accounts that used the same user principal names and proxy addresses as the accounts on premises.  Using our example this is the on premises account representation.

 

[PS] D:\>Get-Recipient TestDuplicate | fl name,recipienttype,exchangeGuid

Name          : Test Duplicate
RecipientType : UserMailbox
ExchangeGuid  : 064e1a94-3199-49a7-9cb3-3ce3412424d6

 

In this example you will note that the ExchangeGUID of the mailbox on premises does not match the ExchangeGUID of the account within the service.  This is to be expected since there is no directory sync relationship.

 

At this time the customer decided to enable directory synchronization on the accounts.  In preparation for this the licenses on the accounts were removed.

 

PS C:> Get-MsolUser -UserPrincipalName testDuplicate@domain.com | fl displayName,userPrincipalName,immutableID,
isLicensed,Licenses

DisplayName       : Test Duplicate
UserPrincipalName : TestDuplicate@domain.com
ImmutableId       :
IsLicensed        : False
Licenses          : {}

 

The license change replicates into Exchange Online resulting in the associated mailbox no longer being valid.

 

PS C:> Get-Recipient TestDuplicate | fl name,recipienttype,exchangeguid,skuAssigned
The operation couldn't be performed because object 'TestDuplicate' couldn't be found on
'CO1PR06A002DC01.NAMPR06A002.prod.outlook.com'.
    + CategoryInfo          : NotSpecified: (:) [Get-Recipient], ManagementObjectNotFoundException
    + FullyQualifiedErrorId : [Server=BY1PR0601MB1402,RequestId=5faf2d1c-dc9a-4fd7-8395-db766ca0cf0b,TimeStamp=9/10/2017 3:53:46 PM] [FailureCategory=Cmdlet-ManagementObjectNotFoundException] 10E01848,Microsoft.Exchange.Management.RecipientTasks.GetRecipient
    + PSComputerName        : ps.outlook.com

 

It was at this time that a very important and often overlooked attribute was set.  Although the license was removed, rendering the Exchange Online mailbox inaccessible, the user's representation within the Exchange Online Active Directory was not removed.  This can be seen with the Get-User command.

 

PS C:> Get-User TestDuplicate

Name          RecipientType
----          -------------
TestDuplicate User

 

An attribute of the user object within the Exchange Online Active Directory is the previous recipient type.  In this case the license removal resulted in this attribute being populated as user mailbox – as the previous recipient type for the linked account was mailbox.

 

PS C:> Get-User TestDuplicate | fl name,recipienttype,previousrecipienttypedetails

Name                         : TestDuplicate
RecipientType                : User
PreviousRecipientTypeDetails : UserMailbox

 

Prior to the existence of previous recipient type details the provisioning process, when handling license additions and removals, would attempt to guess at what the previous recipient type was.  This could lead to mailbox reconnect issues, wrong object provisioning, and other miscellaneous issues when provisioning accounts after license removal and addition.  The previous recipient type details attribute now exists to track the status of the recipient in Exchange Online.  Administrators do not have access to modify this attribute.

 

In our case the administrator is now proceeding with the process of soft matching cloud only accounts to accounts that exist on premises.  The following is after a directory sync cycle where soft matching occurred.  We can now verify that the immutableID field is stamped, indicating the account is linked to an on premises Active Directory account.

 

PS C:> Get-MsolUser -UserPrincipalName testDuplicate@domain.com | fl displayName,userPrincipalName,immutableID,isLicensed,Licenses

DisplayName       : Test Duplicate
UserPrincipalName : TestDuplicate@domain.com
ImmutableId       : YcQouGWUS0Ww3XyDegtu6g==
IsLicensed        : False
Licenses          : {}

 

With the directory synchronization cycle completed the administrator then assigned licenses to the accounts. 

 

PS C:> Get-MsolUser -UserPrincipalName testDuplicate@domain.com | fl displayName,userPrincipalName,immutableID,
isLicensed,Licenses

DisplayName       : Test Duplicate
UserPrincipalName : TestDuplicate@domain.com
ImmutableId       : YcQouGWUS0Ww3XyDegtu6g==
IsLicensed        : True
Licenses          : {ORGANIZATION:STANDARDWOFFPACK}

 

When the license status replicates to Exchange Online an appropriate recipient object will be provisioned.

 

PS C:> Get-Mailbox TestDuplicate | fl name,recipientType,exchangeGUID,skuAssigned

Name          : Test Duplicate
RecipientType : UserMailbox
ExchangeGuid  : 17d89853-ec3e-4bb3-9bf8-a7d022818fb0
SKUAssigned   : True

 

In this example a recipientType of user mailbox has been provisioned.  This is NOT the expected recipient type.  In this example we should have expected a mail user object to be created.  Why did this occur?  Although the on premises ExchangeGUID is populated and replicated by directory synchronization, the user's previous recipient type was user mailbox.  The provisioning process ignores the presence of the replicated ExchangeGUID and in this instance creates a new blank mailbox for the user within Exchange Online, since the previous recipient type details attribute is stamped.

 

ExchangeGuid  CLOUD : 17d89853-ec3e-4bb3-9bf8-a7d022818fb0

ExchangeGuid  ON PREMISES : 064e1a94-3199-49a7-9cb3-3ce3412424d6

 

If the license is removed from the account, the object reverts to a mail user object whose ExchangeGUID matches the one on premises.

 

PS C:> Get-Recipient TestDuplicate | fl name,recipienttype,exchangeguid,skuAssigned

Name          : Test Duplicate
RecipientType : MailUser
ExchangeGuid  : 064e1a94-3199-49a7-9cb3-3ce3412424d6

SKUAssigned   : False

 

ExchangeGuid  CLOUD : 064e1a94-3199-49a7-9cb3-3ce3412424d6

ExchangeGuid  ON PREMISES : 064e1a94-3199-49a7-9cb3-3ce3412424d6

 

The combination of a previous recipient type details of user mailbox and a license triggers the mailbox provisioning process even if ExchangeGUID is stamped.  This can lead to a condition where an on premises mailbox and an Exchange Online mailbox exist at the same time.
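
If you want to check for this condition yourself, a minimal sketch is to compare the ExchangeGUID on both sides, just as was done manually above. This assumes the on premises Exchange cmdlets are loaded locally and an Exchange Online remote session has been imported with a prefix (for example Import-PSSession -Prefix Cloud); the identity is the test account from this post.

$onPremGuid = (Get-Recipient TestDuplicate).ExchangeGuid
$cloudGuid  = (Get-CloudRecipient TestDuplicate).ExchangeGuid
If ($onPremGuid -ne $cloudGuid)
{
    Write-Warning "ExchangeGuid mismatch for TestDuplicate: on premises $onPremGuid, cloud $cloudGuid"
}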

Demo SCOM Script Template


 


 

I thought I’d take a moment to publish my SCOM script template.

Whenever I am writing a SCOM script for monitoring, discovery, or automation, there are some “standards” that I want in all my scripts.

 

1.  I personally feel that all scripts running in SCOM should at a MINIMUM log a script starting event, and a script completed event with runtime in seconds.  This helps anyone evaluating the server, or agent, see just how many scripts are running and on what kind of frequency.

2.  I like to log “who” the script is executed by (what account, whether RunAs or the default agent action account).

3.  I like to have an examples section for manually assigning script variables, which is very handy when testing/troubleshooting.

4.  I assign ScriptName and EventID variables in the script, for consistency when logging events.

5.  I load examples for discovery scripts, propertybags for monitoring scripts, and just remove what isn't needed.  I find this easier and more consistent than going and grabbing an example from some other script I wrote previously.

6.  I have a section on connecting to the SCOM SDK, for scripts that will run automation on the SCOM management server.  I found this method to be the most reliable, as there are scenarios where commandlets just stop working under the MonitoringHost.exe process.

 

I don’t have a lot of “fluff” in here…. I never like it when I have to page down 3 or 4 pages to get to what a script is actually doing…. this is mostly just the meat and potatoes.

 

#=================================================================================
# Describe Script Here
#
# Author: Kevin Holman
# v1.1
#=================================================================================
param($SourceId, $ManagedEntityId, $Param1, $Param2)

# Manual Testing section - put stuff here for manually testing script - typically parameters:
#=================================================================================
# $SourceId = '{00000000-0000-0000-0000-000000000000}'
# $ManagedEntityId = '{00000000-0000-0000-0000-000000000000}'
# $Param1 = "foo"
# $Param2 = "bar"
#=================================================================================

# Constants section - modify stuff here:
#=================================================================================
# Assign script name variable for use in event logging.
# ScriptName should be the same as the ID of the module that the script is contained in
$ScriptName = "CompanyID.AppName.Workflow.RuleMonitorDiscoveryDSWA.ps1"
$EventID = "1234"
#=================================================================================

# Starting Script section - All scripts get this
#=================================================================================
# Gather the start time of the script
$StartTime = Get-Date
# Set variable to be used in logging events
$whoami = whoami
# Load MOMScript API
$momapi = New-Object -comObject MOM.ScriptAPI
# Log script event that we are starting task
$momapi.LogScriptEvent($ScriptName,$EventID,0,"`n Script is starting. `n Running as ($whoami).")
#=================================================================================

# Discovery Script section - Discovery scripts get this
#=================================================================================
# Load SCOM Discovery module
$DiscoveryData = $momapi.CreateDiscoveryData(0, $SourceId, $ManagedEntityId)
#=================================================================================

# PropertyBag Script section - Monitoring scripts get this
#=================================================================================
# Load SCOM PropertyBag function
$bag = $momapi.CreatePropertyBag()
#=================================================================================

# Connect to local SCOM Management Group Section - If required
#=================================================================================
# I have found this to be the most reliable method to load SCOM modules for scripts running on Management Servers
# Clear any previous errors
$Error.Clear()
# Import the OperationsManager module and connect to the management group
$SCOMPowerShellKey = "HKLM:\SOFTWARE\Microsoft\System Center Operations Manager\12\Setup\Powershell\V2"
$SCOMModulePath = Join-Path (Get-ItemProperty $SCOMPowerShellKey).InstallDirectory "OperationsManager"
Import-Module $SCOMModulePath
New-DefaultManagementGroupConnection "localhost"
IF ($Error)
{
  $momapi.LogScriptEvent($ScriptName,$EventID,1,"`n FATAL ERROR: Unable to load OperationsManager module or unable to connect to Management Server. `n Terminating script. `n Error is: ($Error).")
  EXIT
}
#=================================================================================

# Begin MAIN script section
#=================================================================================
# Put your stuff in here
#=================================================================================
# End MAIN script section

# Discovery Script section - Discovery scripts get this
#=================================================================================
# Example discovery of a class with properties
$instance = $DiscoveryData.CreateClassInstance("$MPElement[Name='Your.Custom.Class']$")
$instance.AddProperty("$MPElement[Name='Windows!Microsoft.Windows.Computer']/PrincipalName$", $ComputerName)
$instance.AddProperty("$MPElement[Name='Your.Custom.Class']/Property1$", $Param1)
$instance.AddProperty("$MPElement[Name='Your.Custom.Class']/Property2$", $Param2)
$instance.AddProperty("$MPElement[Name='System!System.Entity']/DisplayName$", $ComputerName)
$DiscoveryData.AddInstance($instance)
# Return Discovery Items Normally
$DiscoveryData
# Return Discovery Bag to the command line for testing (does not work from ISE)
# $momapi.Return($DiscoveryData)
#=================================================================================

# PropertyBag Script section - Monitoring scripts get this
#=================================================================================
# Output a fixed Result = BAD for a monitor example
$bag.AddValue("Result","BAD")
# Output other data from script into bag
$bag.AddValue("Param1",$Param1)
$bag.AddValue("Param2",$Param2)
# Return all bags
$bag
#=================================================================================

# End of script section
#=================================================================================
# Log an event for script ending and total execution time.
$EndTime = Get-Date
$ScriptTime = ($EndTime - $StartTime).TotalSeconds
$momapi.LogScriptEvent($ScriptName,$EventID,0,"`n Script Completed. `n Script Runtime: ($ScriptTime) seconds.")
#=================================================================================
# End of script

 

Do you have stuff you like to place in every script?  If so – let me know in the comments!

Video now available: "Microsoft Japan Surface Event" held on May 26, 2017 [Updated 9/11]


We have published video from the "Microsoft Japan Surface Event", the launch event for the new Surface family held in Tokyo on May 26, 2017.

Panos Panay, Corporate Vice President at Microsoft Corporation and head of the engineering organization for Microsoft devices including Surface, announced new products such as Surface Pro, Surface Laptop, and Surface Studio.

Please take a look at the video!

 

 

Microsoft and Facebook release ONNX, a format for AI model interoperability


[September 7, 2017]

 

Today, Microsoft and Facebook announced the Open Neural Network Exchange (ONNX) format, a common format for interoperability between AI frameworks. Until now, once you chose a framework for developing deep learning models (TensorFlow, Caffe, PyTorch, Chainer, and so on), the models created there could not be moved to other frameworks, which has been one of the major challenges in developing deep learning models. ONNX was designed to solve this problem.

ONNX is supported by Cognitive Toolkit (CNTK), Caffe2, and PyTorch, enabling model interoperability between these frameworks. Because ONNX is being developed as an open source project, the developer community is expected to extend this mechanism further in the future.

 

What is the ONNX representation?

Today's AI frameworks use different formats for the computation graphs that represent neural networks, so they are not compatible with one another. The ONNX representation brings interoperability between frameworks, frees developers from being locked into a single framework and from the effort of converting and migrating models, and enables more agile development. At the same time, hardware vendors can optimize for multiple frameworks at once, saving the time otherwise spent optimizing for each framework individually.

 

Technical overview

ONNX provides an extensible computation graph model, built-in operators, and standard data types. The features required for inference were implemented first. Each dataflow graph is structured as a list of nodes that form an acyclic graph. A node has one or more inputs and one or more outputs, and each node invokes an operator. Metadata attached to the graph records information such as its purpose and author. Operators are implemented outside the graph, but the built-in operators are portable across frameworks. Every framework that supports ONNX implements these operators for the applicable data types.

 

Where to get ONNX

https://github.com/onnx/onnx

 

 

This article is a summary of the following original post:

Article 2


Thank you very much for the many comments we received so soon after publication. Because the volume exceeded our expectations, we are temporarily disabling the comment feature; the valuable comments we have received will be put to use in future information releases. Please stay tuned.


Azure Automation: Schedules can now be edited after being linked to a Runbook


Hello, this is Yamaguchi from the Azure support team.

Here is an update to the Runbook schedule feature for everyone using Azure Automation: you can now edit a schedule's settings after it has been linked to a Runbook.

What's changed


Previously, once a Runbook was linked to a schedule, your only options were to stop or delete the schedule. To change a schedule you therefore had to delete it and create a new one for the same Runbook, but with this update that extra work is no longer necessary.

How to edit a schedule


  1. Sign in to the Azure portal and select the target Automation account from the [Automation Accounts] blade.
  2. Choose [Schedules] from the hub and select the schedule you want to edit.
  3. A screen like the one below appears; adjust the schedule settings and press the [Save] button to apply your changes.

Cloud Migration Success: Moving Orere School To The Microsoft Cloud


It's always exciting to see our Education Partners delivering awesome solutions to schools that assist them in their digital transformation journey. The blog post today shares content from New Era IT showcasing their recent cloud migration of Orere Primary School. There is an excellent video interview with the Principal Kerry Forse that you should watch, in which she says:

“The school is now able to use data from top end to bottom, to make decisions about what we’re doing. That’s led to changing our goals across our school, our target students and looking at the types of assessment and learning we’re doing. All of that has come about by actually thinking about how we use the internet and the technology that we have available to us… It’s changed everything.”

The full case study from New Era IT can be seen here and I've reproduced much of it with permission below. What I especially like about this case study is it shows that once you resolve technical issues through the use of Cloud, you open up new possibilities for schools. For Orere Primary, this meant being able to use NZCER and greater data analysis of each student and their results.

As New Era IT’s first primary school to move exclusively to cloud services, Orere Point is now a lighthouse school for the Cloud Transformation Project.

Over the term one break, the rural Auckland school’s ICT services were migrated to Microsoft Office 365 and Azure, and ageing school devices were replaced with new Windows 10 desktops and tablets, specified for classroom environments. After a term in the cloud, Principal Kerry Forse is excited about the progress. “We’ve been on the cloud for 10 weeks, and we’ve not had one issue. There’s been nothing. The internet has been stable, we’ve changed our Student Management System to a better system and we’re now using NZCER for assessment so the children are now doing online assessments.”

So has the cloud made a big difference for the students? “Yep, heaps,” says Jese, a year 8 student from room one.

Working locally on the old computers was frustrating. “It was very complicated. Before you had to go on the same computer to get that document. So you’d have to work in groups because all the work would be saved on that one computer.” Office 365 has opened up the possibilities. “This is just all in the browser, you can access it from home. I could probably access it from the other side of the world if I wanted to. It’s way easier than before.”

Integrating e-Learning is balanced with a face-to-face approach. “My teacher said he didn’t want us to be like other schools where they don’t interact with the teacher so we try to keep an even balance between ‘computer life’ and interacting with the teacher.”

Targeting students’ needs is a benefit of the new platform, says Kerry. “Now we can fine tune what teachers are doing to specifically target the needs of the children. The school is now able to use data from top end to bottom, to make decisions about what we’re doing. That’s led to changing our goals across our school, our target students and looking at the types of assessment and learning we’re doing. All of that has come about by actually thinking about how we use the internet and the technology that we have available to us… It’s changed everything.”

Robotics, programming and further development of e-Learning are all on the horizon, but for now, the skies are looking clearer at Orere Point.

To learn more about Cloud Transformation with New Era IT, visit newerait.co.nz/cloud

TechNet Wiki News – Try Azure Cosmos DB for free


 

Hello, readers of the Wiki Ninjas Brasil blog!

Welcome to another TechNet Wiki News.

Last week (Friday, September 8) Microsoft announced a campaign called Try Azure #CosmosDB for free.

Azure Cosmos DB is a multi-model NoSQL service that is part of Azure, Microsoft's cloud computing platform. With support for technologies such as MongoDB, DocumentDB, Azure Tables, and Gremlin (graphs), Cosmos DB also stands out for characteristics such as high availability, low latency, and easy scalability, as well as very sophisticated replication and consistency mechanisms.

The idea behind this initiative is to let you try Azure Cosmos DB at no cost, for a limited time. For more information and/or to start taking part in the Try Azure #CosmosDB for free promotion, go to the following address:

https://azure.microsoft.com/en-us/try/cosmosdb/

On the next screen you choose the API/data model (DocumentDB/SQL, MongoDB, Table, or Graph/Gremlin):

You will then be asked to sign in with a Microsoft account (Live ID). Once authentication is complete, a notice appears indicating that the trial is available for a period of 24 hours:

 

Clicking the Open in Azure Portal option displays the different options available for managing the account created for using Cosmos DB:

 

In the next image you can see the connection information for an account based on MongoDB:

 

The following image highlights a MongoDB database and collection created through the trial account:

 

Interested in learning more about Azure Cosmos DB? Be sure to check out the following links:

Azure DocumentDB: visão geral e primeiros passos

Azure Cosmos DB Documentation

And that's it for today, folks… Until next time!

   

Wiki Ninja Renato Groffe (MVP, Wiki, Facebook, LinkedIn, MSDN)

Messaging Records Management & confusion with Litigation Hold


Many customers I speak with are confused by the terminology used with Messaging Records Management (MRM).  Often people assume that if you set a Retention Policy the item will be kept for the number of days specified.  For them this implies the items in question will also be protected for that period of time.  This blog's aim is to review the terminology used by MRM and distinguish what MRM can and cannot do.

To help illustrate the differences I will use an example.

A few months ago I had a customer who thought items were being kept in the mailbox longer than intended.  He had set up a Default MRM Deletion tag to remove items that were more than 3 years old.  He had also turned on Litigation Hold on the mailbox.

Here are the properties from the mailbox:

LitigationHoldEnabled                  : True

SingleItemRecoveryEnabled              : True

RetentionHoldEnabled                   : False

LitigationHoldDate                     : 6/7/2016 9:54:28 AM

LitigationHoldOwner                    : admin@contoso.com

LitigationHoldDuration : Unlimited

RetentionPolicy                        : Default MRM Policy

 

Here are the properties of the 3 year delete retention tag:

WhenChanged                           : 2016-12-05 9:42:54 AM

WhenCreated                           : 2016-06-29 1:39:10 PM

WhenChangedUTC                        : 2016-12-05 2:42:54 PM

WhenCreatedUTC                        : 2016-06-29 5:39:10 PM

 

 

In this case the Litigation Hold was applied before the retention policy. One part of the misunderstanding was that the customer thought the Retention setting from the Default tag took precedence over the Litigation Hold. Unfortunately it is exactly the opposite. In this case the words got in the way. The world outside Microsoft gives a different meaning to retention than the developers of Exchange 2010 and MRM gave to it back when this was designed in 2008.
A few definitions/explanations:

 

Litigation Hold is a blanket that is thrown over the entire mailbox.  When the database engine is asked to permanently remove an item from the Exchange database there is a check to see if the mailbox is protected by Litigation Hold.  If the answer is Yes the item cannot be deleted.  If the answer is No the deletion proceeds and the item is gone forever.  You might say it is a mail item's last chance to avoid destruction.  Litigation Hold ensures an item cannot be permanently removed from the mailbox when it is younger than the specified age (specified in the LitigationHoldDuration property).  For everything except Calendar Items and Tasks that age is calculated from the date/time at which the item was received.  If there is no Receive Date then the creation date is utilized for the calculation.  Calendar Items and Tasks use the last occurrence as the basis for the calculation instead of the received date.
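
As a point of reference, a hold like the one in the example above can be applied with a single cmdlet. This is only a sketch - the identity is the test mailbox used throughout this post and 1096 days is the 3 year duration discussed below.

Set-Mailbox -Identity "Recipient Test" -LitigationHoldEnabled $true -LitigationHoldDuration 1096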

 

In-Place Hold has the same effect on permanent deletion of items that Litigation Hold does.  However, the scope of an In-Place Hold is different.  Where Litigation Hold applies to EVERYTHING in the mailbox, an In-Place Hold utilizes a query to protect items.  For example, let's assume that only Invoice related items are considered important enough to keep for 3 years.  We could create an In-Place Hold that checks the Subject and Message body for the word "Invoice".  If the word "Invoice" is present on the item its permanent removal from the mailbox will be prevented for 3 years.  If the word is absent the item can be removed as if the In-Place Hold was not there.
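
A sketch of what such an Invoice hold might look like using the classic New-MailboxSearch style of In-Place Hold; the search name and source mailbox are illustrative only.

New-MailboxSearch "Invoice Hold" -SourceMailboxes "Recipient Test" -SearchQuery "Invoice" -InPlaceHoldEnabled $true -ItemHoldPeriod 1096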

 

Retention Policies (aka Messaging Records Management (MRM)) are named a little counter intuitively.  They actually do the opposite of retaining items.  MRM specifies how old an item can be before it is deleted or archived.  One way to look at it is to compare it to the expiry date stamped on some grocery items. The expiry date on the items suggests when they should be thrown away. There is nothing to stop you from throwing them away sooner. The date does not guarantee the item will stay in your kitchen until the date specified. You can almost look at MRM as an automated process that cleans out the kitchen's expired items. MRM does nothing to prevent disposal of the item earlier than the expiry date; that is the job of the legal holds described above.

The original name for MRM, before Exchange 2007 shipped, was Email Life Cycle (ELC).  You will still see ELC in some of the results from PowerShell cmdlets and Microsoft's internal documentation.  MRM consists of 3 types of tags.  Each has a different scope and order of precedence.  Here is a summary of each:

Personal Tags - These are manually applied to an item by the user by right-clicking the item and assigning the personal tag (often called a policy in Outlook).  Personal tags can be applied to most items inside a mailbox.  Because they are manually applied by a user they take precedence over the other two types of tags.  It is assumed the user knows best when they apply this type of tag.  Personal tags can archive items to the Archive Mailbox or they can order the deletion of an item.

Folder Tags (aka RPT tags) - Folder tags can only be applied to the default folders that appear in all Exchange Mailboxes.  The only action they can carry out is delete.  They are considered to be more important than a default tag.  If a folder tag is created it applies to the folder in Both the primary mailbox and the Archive Mailbox.

Default Tags (aka DPT tags) - These are applied when an item is not subject to either of the previous tags.
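
To make the three tag types concrete, here is a rough sketch of creating one of each; the tag names, ages, and actions are illustrative only.

# Personal tag - applied manually by the user
New-RetentionPolicyTag "Personal - 1 Year Delete" -Type Personal -RetentionEnabled $true -AgeLimitForRetention 365 -RetentionAction DeleteAndAllowRecovery

# Folder (RPT) tag - bound to a default folder, delete only
New-RetentionPolicyTag "Deleted Items - 30 Day Delete" -Type DeletedItems -RetentionEnabled $true -AgeLimitForRetention 30 -RetentionAction DeleteAndAllowRecovery

# Default (DPT) tag - applies when no other tag does
New-RetentionPolicyTag "Default - 3 Year Delete" -Type All -RetentionEnabled $true -AgeLimitForRetention 1095 -RetentionAction DeleteAndAllowRecovery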

 

Recoverable Items\Deletions - This folder maps to the Recover Deleted Items functionality in Outlook and OWA.  Any item in this folder can be recovered with the standard mail clients.  How long items stay in this folder is governed by the RetainDeletedItemsFor property of the mailbox (this property is visible when you output the results of Get-Mailbox).

Recoverable Items\Purges - This folder houses items that have been deleted for more than RetainDeletedItemsFor days AND that are protected by Litigation Hold.

Recoverable Items\DiscoveryHolds - This folder houses items that have been deleted for more than RetainDeletedItemsFor days AND that are protected by In-Place Hold.
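
If you want to see these folders and how much they currently hold for a given mailbox, a quick sketch (the identity is the test mailbox from the earlier example):

Get-MailboxFolderStatistics -Identity "Recipient Test" -FolderScope RecoverableItems | ft Name,ItemsInFolder,FolderAndSubfolderSize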

 

 

The Life cycle of an email

 

For all of these examples we will make three assumptions:

Single Item recovery is enabled for the mailbox (the default in Exchange Online)

The RetainDeletedItemsFor property of the mailbox is set to 14 days with this value:  14.00:00:00

When a Litigation Hold or In-Place Hold is used the duration of this hold is 3 years.  The 3 years is recorded as 1096 days to account for the possibility of a leap year.
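
As a sketch, the first two assumptions can be verified or set on a mailbox like this (the identity is illustrative):

Get-Mailbox "Recipient Test" | fl SingleItemRecoveryEnabled,RetainDeletedItemsFor
Set-Mailbox "Recipient Test" -SingleItemRecoveryEnabled $true -RetainDeletedItemsFor 14.00:00:00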

 

 

Scenario 1:  A Mailbox with no Retention Policy or Litigation/In-Place Hold.  User Just reads the Item.

  • An item arrives in the mailbox on March 1st 2012 at 15:37:16.714 UTC time.
  • The user reads the item on March 10th 2012 at 02:19:47:917 UTC time and never takes action on the item again.
  • Today this item would still be sitting in the Inbox.

 

 

Scenario 2:  A Mailbox with no Retention Policy or Litigation/In-Place Hold.  User deletes the item a few weeks after reading it.

  • An item arrives in the mailbox on March 1st 2012 at 15:37:16.714 UTC time.
  • The user reads the item on March 10th 2012 at 02:19:47:917 UTC time.
  • The user Shift deletes the item on April 3rd 2012 at 20:05:52.574.  The mail item moves from Inbox to Recoverable Items\Deletions at the time of the Shift+Delete.  Shift+Delete is considered a Hard Delete.  Without single item recovery enabled the item would disappear from the mailbox instantly.  Since single item recovery is the default behaviour for Exchange Online mailboxes the message moves to the Deletions folder.
  • On April 17th, 2012 at 20:05:52.575 the item is no longer protected by the RetainDeletedItemsFor property.  The database engine permanently removes the item within a few minutes.

 

 

Scenario 3a:  A Mailbox with no Retention Policy.  The mailbox has a 3 year Litigation Hold.  User deletes the item a few weeks after reading it.

  • An item arrives in the mailbox on March 1st 2012 at 15:37:16.714 UTC time.
  • The user reads the item on March 10th 2012 at 02:19:47:917 UTC time.
  • The user Shift deletes the item on April 3rd 2012 at 20:05:52.574.  The mail item moves from Inbox to Recoverable Items\Deletions at the time of the Shift+Delete.  Shift+Delete is considered a Hard Delete.
  • On April 17th, 2012 at 20:05:52.575 the item is no longer protected by the RetainDeletedItemsFor property.  The database engine attempts to permanently remove the item within a few minutes.  The removal attempt cannot proceed because of the Litigation Hold.  The server does not want to leave the item where it can be recovered by Outlook or OWA.  Therefore it moves the item to the Purges folder.
  • On March 2nd, 2015 at 15:37:16.715 UTC time the protection of the Litigation Hold ends.  The database engine permanently removes the mail item from the Purges Folder.

 

 

Scenario 3b:  A Mailbox with no Retention Policy.  The mailbox has a 3 year Litigation Hold.  User deletes the item a little more than 5 years after reading it.

  • An item arrives in the mailbox on March 1st 2012 at 15:37:16.714 UTC time.
  • The user reads the item on March 10th 2012 at 02:19:47:917 UTC time.
  • The user Shift deletes the item on April 3rd 2017 at 20:05:52.574.  The mail item moves from Inbox to Recoverable Items\Deletions at the time of the Shift+Delete.  Shift+Delete is considered a Hard Delete.
  • On April 17th, 2017 at 20:05:52.575 the item is no longer protected by the RetainDeletedItemsFor property.  The database engine attempts to permanently remove the item within a few minutes.  The removal attempt succeeds because the Litigation Hold only protected this item for 1096 days from its arrival on March 1st 2012 at 15:37:16.714 UTC time.  Therefore any delete of the item at or after March 2nd, 2015 at 15:37:16.715 UTC time proceeds as if there was no Litigation Hold in place.

 

 

Scenario 4a:  A Mailbox with a 3 year delete Retention Policy.  The mailbox has a 3 year Litigation Hold.  User deletes the item a few weeks after reading it.

  • An item arrives in the mailbox on March 1st 2012 at 15:37:16.714 UTC time.
  • MRM stamps the item with a retention date of March 1st 2015 at 15:37:16.714 UTC time.
  • The user reads the item on March 10th 2012 at 02:19:47:917 UTC time.
  • The user Shift deletes the item on April 3rd 2012 at 20:05:52.574.  The mail item moves from Inbox to Recoverable Items\Deletions at the time of the Shift+Delete.  Shift+Delete is considered a Hard Delete.
  • On April 17th, 2012 at 20:05:52.575 the item is no longer protected by the RetainDeletedItemsFor property.  The database engine attempts to permanently remove the item within a few minutes.  The removal attempt cannot proceed because of the Litigation Hold.  The server does not want to leave the item where it can be recovered by Outlook or OWA.  Therefore it moves the item to the Purges folder.
  • On March 2nd, 2015 at 15:37:16.715 UTC time the protection of the Litigation Hold ends.  The database engine permanently removes the mail item from the Purges Folder.  In this instance the 3 year delete policy has no effect on how the item is handled.

 

 

Scenario 4b:  A Mailbox with no Retention Policy.  The mailbox has a 3 year Litigation Hold.  User forgets the item after reading it.

  • An item arrives in the mailbox on March 1st 2012 at 15:37:16.714 UTC time.
  • MRM stamps the item with a retention date of March 1st 2015 at 15:37:16.714 UTC time.
  • The user reads the item on March 10th 2012 at 02:19:47:917 UTC time.
  • MRM’s ManagedFolderAssistant completes at 15:30 on March 1st, 2015.  The item is not deleted this day because MRM completed a few minutes before the expiry time of 15:37:16.714 UTC.
  • MRM gets throttled on each of the next 4 days and terminates for the day without deleting the item.
  • The deletion that moves the item from Inbox to Recoverable Items\Deletions takes place on March 6th 2015 at 15:29:28.520 UTC time.
  • On March 20th 2015 at 15:29:28.521 UTC time the item is no longer protected by RetainDeletedItemsFor.  The database engine deletes it within minutes.  The 3 year litigation hold does not prevent the removal from the Deletions folder because the Litigation Hold only protected the item from deletion between March 1st 2012 at 15:37:16.714 UTC and March 2nd, 2015 at 15:37:16.714 UTC

 

 

None of these scenarios cover the item moving through the Deleted Items folder.  This is partly because it changes nothing with regard to the operations I am trying to demonstrate and partly because this post is just getting to be too long. 😊

 

Going back to the initial example near the top of this post...
Assume the customer's intention had been to purge all items more than 3 years old in the mailbox, and then begin a litigation hold of unlimited duration. They should apply the MRM policy to the mailbox first. For the retention policy to be effective I would recommend that the tag and policy be created and applied to the mailbox at least two weeks before the Litigation Hold is applied.  Messaging Records Management (MRM) tries to run once per day, but Microsoft only supports one completion per week.  It is a heavily throttled process. MRM is stopped any time the server shows that it is too busy, because terminating MRM removes an extra task that may hinder the experience of users connected to the server.  I recommend two weeks instead of one as it often takes two complete executions of MRM to purge items.
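
If you do not want to wait for the throttled schedule, the assistant can also be started manually; running it again a day or two later lines up with the two-completion behaviour described above. The identity is illustrative.

Start-ManagedFolderAssistant -Identity "Recipient Test"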

 

 

Chris Pollitt

How To Tell If You’re 5 Years Out Of Date On Security Updates

$
0
0

There's a fun indicator you can use to quickly evaluate whether you've been missing security updates for the last five years (ish) on older Operating Systems (i.e. Win2008-2008 R2), and it's the build number. Not infallible, but then not often wrong.

Back In The Day, Build Numbers Were Even More Useful

Very helpfully, the Windows Vista era introduced incremental build numbers for Operating System versions. So when it shipped, Windows Vista - which you'll recall came out almost a year ahead of the server equivalent, Windows Server 2008 - shipped with the build number 6000.

Windows Server 2008 shipped with "Windows Vista" Service Pack 1 inbuilt, as it were, and so Vista SP1 and Windows Server 2008 RTM share the same build number, 6001.

Service Pack 2 followed, again incrementing the build number to 6002.

For the Windows 7 era, things were a bit more straightforward. Windows 7 and Windows Server 2008 R2 shipped at about the same time, as build 7600.

When Service Pack 1 was released for both, the build number incremented to 7601.

Quite a few of our Premier Security Assessments pull OS information using WMI from targets, and I sort by the reported build number to quickly identify groups of hosts which might not have a Service Pack. It's very, very infrequently wrong. You could equally do the same by whether "Service Pack X" appears in the CSDVersion, but the build number is a nice, straightforward way of identifying this if you're collecting it widely.

(AD Computer objects track what appears to be the same information, so querying AD might be a viable option if you're reasonably certain that the computer objects there are still "live").
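
As a sketch of the WMI approach, something like the following pulls the caption, build number, and Service Pack string so hosts can be grouped by build; the server names are placeholders, and Get-WmiObject could be substituted on older management hosts.

Get-CimInstance Win32_OperatingSystem -ComputerName SERVER01,SERVER02 |
    Select-Object PSComputerName,Caption,BuildNumber,CSDVersion |
    Sort-Object BuildNumber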

What can you do with this information?

Well, you can be fairly confident that anything which self-reports as being build 7600 - i.e. not 7601 - hasn't had any Windows security updates since about 2013.

The Support Lifecycle site notes that without SP1, Windows Server 2008 R2 (7600) exited support in April 2013. That's the point after which security updates stop applying, because they require SP1 (7601), which isn't installed.

Likewise, if you've a Windows Server 2008 (6001) Server, it hit End Of Support at the same time (and Service Pack 2 (6002) is required for any updates beyond that point).

If you haven't got the relevant Service Pack approved in WSUS (or SCCM), the computers won't even see updates beyond this point as being applicable. So it might seem like you've a bunch of completely updated and compliant servers (until, on closer inspection, you find lots of updates aren't applicable to them), but if they haven't taken the Service Pack, they're only as updated as they self-report. And they know the newer updates aren't for them.

In this case, "newer" means "pretty much everything since mid 2013".

What should you do?

So here's what to do: Pull a report of the OS versions reported by servers within your environment. Clients too, if you think it's possible some don't have Win7 SP1.

You could do something like:

  • PS: get-adcomputer -Filter '(OperatingSystemVersion -like "*7600*") -or (OperatingSystemVersion -like "*6001*")' -Properties OperatingSystemVersion,OperatingSystemServicePack | export-csv NoServicePack.csv
    • (Blank NoServicePack.csv = good)
  • WSUS: Turn on the Version column in the Computers view in the WSUS console, then Group By (or just Sort by) Version and look at the build numbers reported.

If there are 7600s or 6001s found, check a few out, and just confirm whether they're missing the relevant Service Pack. If they are, try to work out and address the root cause - for example, the Service Pack update wasn't approved, or the WSUS catalog doesn't include the update, or the PC isn't in the right SCCM update group, or... whatever it is.

As a note, if you're in that bucket, you're likely to have many updates to apply, which will likely take some time and disk space to chew through. (If it's simpler to redeploy an OS with a current build than update an older one, consider that).

And

And if you've found some unpatched boxes as a result of reading this, a) phew, lucky we found them now, and b) really think about that root cause. Mistakes are inevitable; does your process allow for mistakes and have any built-in correction for them? Update management isn't always easy, but many update policies are geared towards fragility and failure, due to excessive process being required for an update to make it to the target box. A process failure without a corrective phase might result in updates being missed for years.

In some cases, what we hear is that some set of updates are initially rejected (or "deferred") due to issues or concerns, which is fair enough - but then the decision doesn't get revisited for months or years afterwards - sometimes never, until the update state is compared with Windows Update. If you don't look back and check your assumptions - really test what updates are deployed and what you're still vulnerable to - then things can rapidly and near-invisibly deteriorate, until suddenly, one day you're looking back at 5 years of unpatched systems.

Core question: If the participants in your existing update process/policy had been pointed directly at Windows Update and set to update weekly, how many Critical and Important updates might have been applied in the interim?

 

And And: an afterthought for 2012 R2

I haven't got into 2919355 yet, but it's the 2012 R2 (and Windows 8.1) equivalent of a Service Pack, and as of late 2014, it became the mandatory update on which all other 2012 R2 (and 8.1) updates depended. Unfortunately, I don't think it's a simple build check for that one (though it might be visible through the detailed build reported by the WSUS console - I don't have one to check right now), but it's the other key update we find missing when evaluating update state.
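
One rough way to spot-check a handful of machines for it (the server name is a placeholder; no output suggests the update isn't installed):

Get-HotFix -Id KB2919355 -ComputerName SERVER01 -ErrorAction SilentlyContinue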
