
IoT Project for Anyone – Collecting and Streaming Data with Raspbian Operating System


IoT Project for Anyone Blog Series:

This is one of a series of blogs developed as part of an overall IoT implementation project for beginners. The project will walk through the hardware needed, operating system installation, data collection and a Power BI dashboard to view the data. These blogs can be reviewed in sequence as part of the overall project or on their own if looking for information on a particular IoT topic.

 

Project Blog Index:

IoT for Anyone - Introduction

IoT for Anyone - Materials and Connections

IoT for Anyone - Windows 10 IoT Core Operating System Installation

IoT for Anyone - Raspbian Operating System Installation

IoT for Anyone - Power BI Streaming Dataset Setup

IoT for Anyone - Collecting and Streaming Data from Windows 10 IoT (coming soon)

IoT for Anyone - Collecting and Streaming Data from Raspbian Operating System (this blog)

IoT for Anyone - Designing IoT Microsoft Power BI Dashboard

 

Introduction:

The steps in this blog will configure a Raspberry Pi running the Raspbian operating system to collect the temperature and humidity sensor data from the Adafruit DHT22. The data will then be streamed to Microsoft Azure. These steps are different from those used to configure a Raspberry Pi running Windows 10 IoT Core; for Windows 10 IoT Core, review the corresponding blog in this series.

 

Several of these commands were originally published in Sirui Sun's blog called "Building a Real-Time IoT Dashboard with Power BI: A Step-by-Step Tutorial." I have added additional instructions for readers who may be unfamiliar with the Raspbian operating system.

 

Configure Data Collection:

  1. While logged into the Raspbian operating system, open a shell.

  2. Enter the commands below (all commands below are case sensitive). Enter "Y" at any prompts during the installation of each package.

    sudo apt-get update

    sudo apt-get install build-essential python-dev

    git clone https://github.com/adafruit/Adafruit_Python_DHT.git && cd Adafruit_Python_DHT && sudo python setup.py install

  3. In the Raspbian operating system, open the Chromium web browser application by clicking on the globe icon below.

  4. Go to the following site to download the script: https://github.com/Azure-Samples/powerbi-python-iot-client/

  5. Verify you see the "uploadDHT22Data.py" Python script on the screen.

  6. Download the Python script we'll use to collect the data. Click on the "Clone or download" option.

  7. Click on the "Download Zip" option. The zip file containing the script will be downloaded to a default folder.
  8. Open a Bash window and enter the commands below. We'll create a new directory and unzip the Python script there so we can work with it. Remember, these commands are case sensitive, so copy them exactly as displayed.

    mkdir -p /home/pi/script && unzip /home/pi/Downloads/powerbi-python-iot-client-master.zip -d /home/pi/script

  9. Now, we will edit the Python script to send data to your Power BI dashboard. Begin by opening the File Explorer by clicking on the folder icon in the upper left of the screen.

  10. In File Explorer, navigate to where the extracted files are located.

  11. Right-click on the "uploadDHT22Data.py" file and select "Text Editor."

  12. With the file open in the Text Editor, locate the line REST_API_URL = " *** Your Push API URL goes here *** ".

  13. In this file, paste the Push URL you saved when you set up the Power BI streaming dataset in an earlier blog. Keep the quotation marks around the URL you enter. Because this URL is very long, instead of retyping the entire string I saved my URL to a text file on a USB thumb drive on my computer, inserted the thumb drive into the Raspberry Pi, located the drive in File Explorer, opened my file, copied the contents, and pasted it into the Python script. If you use the USB thumb drive method, remember that in the Raspbian operating system the drive is "mounted"; by default, it should appear under the /media/pi/(USBDriveName) folder.

  14. When the file is edited to include your Power BI Push URL, it should look like the file below (I am not going to show my entire file):

  15. You may also want to modify the script to collect the temperature data in Fahrenheit instead of the default of Celsius. If so, add the line of code below in the section shown within the script (case sensitive), then change the label from C to F on the next line (highlighted).

    temp = temp*9/5+32 #Convert Celsius to Fahrenheit using temp variable

  16. With the file edit complete, click on "File" and then "Save."

  17. With the Python script now set up to collect data and send it to Power BI, you are ready to launch the script in the next section.

     

Run the Python Script:

  1. Open a shell

  2. Enter the command below to change to the directory where the Python script is located (remember, these commands are all case sensitive).

    cd /home/pi/script/powerbi-python-iot-client-master

  3. To confirm we are in the correct directory where the script is located, run a directory listing by entering the command below. You should see your "uploadDHT22Data.py" Python script with the file date and time of when you just modified it.

    dir -l

  4. Enter the command below to launch the data collection script (case sensitive):

    python uploadDHT22Data.py

  5. The temperature and humidity information should now be refreshing every second. A 200 response should be seen after each data poll, indicating the information is successfully uploading to the Power BI dashboard area.
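If you are curious what the script is doing on each of those one-second polls, the loop looks roughly like the sketch below. This is illustrative only and not the exact contents of uploadDHT22Data.py; it assumes the Adafruit_DHT library installed earlier, the requests module, a DHT22 wired to GPIO pin 4, and field names that match the streaming dataset from the earlier blog.

    import time
    from datetime import datetime

    import requests
    import Adafruit_DHT  # installed from the Adafruit_Python_DHT repository above

    REST_API_URL = " *** Your Push API URL goes here *** "

    while True:
        # Read the DHT22; adjust the GPIO pin number to match your wiring
        humidity, temp = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, 4)
        if humidity is not None and temp is not None:
            temp = temp * 9 / 5 + 32  # optional Celsius-to-Fahrenheit conversion
            payload = [{"timestamp": datetime.utcnow().isoformat(),
                        "temperature": temp,
                        "humidity": humidity}]
            response = requests.post(REST_API_URL, json=payload)
            print(response.status_code)  # 200 means the row reached Power BI
        time.sleep(1)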

 

Troubleshooting:

  1. What happens if you do not see data being pushed?
    1. Check the wiring of the Adafruit temperature and humidity sensor. Make sure all wires are attached correctly to the breadboard and are snug in place, in the correct slots. Double-check the wiring diagram from Adafruit. Also make sure the Adafruit Cobbler connector and ribbon cable have snug connections. If you find and correct any loose wires, you will need to reboot the Raspberry Pi and execute the script again as in the step above.
    2. Verify you have connectivity to the Internet
  2. What if there is a script error being thrown?
    1. If you see an error on the screen after executing the Python script, I recommend downloading a new copy of the uploadDHT22Data.py file from the Internet again. Then, edit that file with the Push URL you created and try again.
    2. If you are receiving an error similar to "unknown format code f for object of type str", that indicates the wiring on the Adafruit temperature and humidity sensor is not correct. How do I know this, you ask? Because I found one of my children playing with the wiring one day, causing this error to be thrown. 😉

 

Auto Start Data Collection Script:

Now that the data is being successfully uploaded to Power BI, and before we create our Power BI dashboard in the next blog, we want to make sure this script runs on every boot of the Raspberry Pi or other IoT device. The instructions below will set up the sensor collection script to start automatically at boot.

  1. While logged into the Raspbian operating system, open the folder explorer.

  2. In the top navigation area, enter the folder path to the autostart file. These folders are hidden by default, so you will not be able to browse to them by clicking through.

  3. On the "autostart" file, right click and select "Text Editor"

  4. At the bottom of the file, enter the text below, assuming you used the directory paths listed earlier in this blog (modify the path if needed).

    @/usr/bin/python /home/pi/script/powerbi-python-iot-client-master/uploadDHT22Data.py

  5. Click on "File" and "Save" when the edit above has been added.

 


IoT Project for Anyone – Designing IoT Microsoft Power BI Dashboard


IoT Project for Anyone Blog Series:

This is one of a series of blogs developed as part of an overall IoT implementation project for beginners. The project will walk through the hardware needed, operating system installation, data collection and a Power BI dashboard to view the data. These blogs can be reviewed in sequence as part of the overall project or on their own if looking for information on a particular IoT topic.

 

Project Blog Index:

IoT for Anyone - Introduction

IoT for Anyone - Materials and Connections

IoT for Anyone - Windows 10 IoT Core Operating System Installation

IoT for Anyone - Raspbian Operating System Installation

IoT for Anyone - Power BI Streaming Dataset Setup

IoT for Anyone - Collecting and Streaming Data from Windows 10 IoT (coming soon)

IoT for Anyone - Collecting and Streaming Data from Raspbian Operating System

IoT for Anyone - Designing IoT Microsoft Power BI Dashboard (this blog)

 

Introduction:

Power Business Intelligence (Power BI) is an amazing product that enables those of us who are not data scientists, and those of us who are perhaps graphically challenged, to do many amazing things with data. The instructions below will walk you through one way to graphically represent the data being streamed. However, think about how you can change it to suit what you want. Keep in mind that you cannot break the dashboard, so after you get the data initially displayed, play with the many settings offered. See what you can come up with!

 

Continuing with the articles in this blog series, we now have temperature and humidity data being collected by our Raspberry Pi running either the Windows 10 IoT Core or Raspbian operating system. So now, how do we graphically display this data in a Power BI dashboard? Some readers of this blog series may be new to Power BI and looking for more information; below are several overview references:

 

Several of the steps below were originally published in Sirui Sun's blog called "Building a Real-Time IoT Dashboard with Power BI: A Step-by-Step Tutorial." I have added additional instructions for readers who may be unfamiliar with the Power BI interface and provided a few additional ways to display the data.

 

  1. Log on at powerbi.microsoft.com to the Power BI account you used in an earlier blog to create the streaming dataset.

    Expand the workspace pane on the left side of the screen by clicking on the three-lined icon.

  2. Click on the My Workspace area. Because this is a new workspace, no Dashboards should be seen.

  3. Click on the +Create option in the upper right to begin the creation of a new Power BI dashboard. In the drop-down window, select Dashboard.

  4. Give your new dashboard a name.

     

  5. With the dashboard now created, we can start to define the display of our data. On the top of the new dashboard, click + Add tile.

  6. Select the Real-Time Data option for Custom Streaming Data.

  7. Click Next.
  8. On the next screen to Add a custom streaming data tile, select the dataset shown and then click Next at the bottom of the screen.

  9. Under Visualization Type, choose "Card." Under the "Fields" area, select temperature.

  10. Provide a Title and Subtitle for the new Tile. I supplied the values below. When done, click "Apply."

    Title: Temperature

    Subtitle: Fahrenheit

  11. The result is the tile below:

  12. Next, we'll create a few live data streaming areas for a more active view of the data being collected. Begin again by clicking +Add tile.

  13. Select the Real-Time Data option for Custom Streaming Data.

  14. On the next screen to Add a custom streaming data tile, select the dataset shown and then click Next at the bottom of the screen.

  15. Under Visualization Type, this time choose the "Line Chart" option. Also choose the Axis, Values and Time options according to the screen options below. Click Next when complete.

    Axis: timestamp

    Values: temperature

    Time window to display: 1 minute

  16. Repeat the steps above starting with step 15. Use the same values except for the Time Window to Display. Instead of using the default of one minute, choose the time frame of 60 minutes as shown in the picture below.
  17. When complete, you will now have three tiles on your screen, each with streaming data from your Raspberry Pi. Using the drag and drop capabilities within the Power BI Dashboard Designer, rearrange the data sets to your preference. Below is my display of the unit running in my climate controlled home office.

  18. Repeat the steps above starting with Step 7, but instead of using Temperature, use the Humidity field for each item. Below is my new view that includes Temperature readings as well as my new Humidity data view.

     

  19. As a test of your live data stream, cup your hands over the sensor and breathe onto it. The humidity and temperature should increase rapidly. Below is the impact on my sensor data, displaying the spike.
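If you would rather not breathe on the sensor, you can also push a synthetic test row at the same Push URL and watch the tiles react. Here is a minimal sketch in Python using the requests module; the URL placeholder and the field names are assumptions and must match your own streaming dataset.

    from datetime import datetime

    import requests

    PUSH_URL = " *** Your Push API URL goes here *** "

    # One exaggerated reading makes the spike easy to spot on the dashboard tiles
    row = [{"timestamp": datetime.utcnow().isoformat(),
            "temperature": 99.0,
            "humidity": 90.0}]

    response = requests.post(PUSH_URL, json=row)
    print(response.status_code)  # expect 200 if the dataset accepted the row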

What SQL maintenance should I perform on my SCOM 2016 databases?


 

image

 

***Note – The products and recommendations have changed over the years, so what applied to previous versions does not really apply today.  Make sure you read the entire article!

 

The SQL instances and databases deployed to support SCOM generally fall into one of two categories:

1.  The SQL server is managed by a DBA team within the company, and that team's standards will be applied.

2.  The SCOM team fully owns and supports the SQL servers.

 

Most SQL DBAs will set up some pretty basic default maintenance on all SQL DBs they support.  This often includes, but is not limited to:

  • CHECKDB  (to look for DB errors and report on them)
  • UPDATE STATISTICS  (to boost query performance)
  • REINDEX  (to rebuild the table indexes to boost performance)
  • BACKUP

SQL DBAs might schedule these to run via the SQL Agent nightly, weekly, or some combination of the above, depending on DB size and requirements.

 

On the other side of the coin.... in some companies, the SCOM team installs and owns the SQL server.... and they don't do ANY default maintenance on SQL.  Because of this all too common scenario, a focus in SCOM was to have the Ops DB and Data Warehouse DB be somewhat self-maintaining.... providing a good level of SQL performance whether or not any default maintenance is being done.

 

Operational Database:

Daily jobs that run for the OpsDB:

  • 12:00 AM – Partitioning and Grooming
  • 2:00 AM – Discovery Data Grooming
  • 2:30 AM – Optimize Indexes
  • 4:00 AM – Alert auto-resolution

 

Reindexing is already taking place against the OperationsManager database for some of the tables (but not all, and this is important to understand!).  This is built into the product.  What we need to ensure - is that any default DBA maintenance tasks are not conflicting with our built-in maintenance, and our built-in schedules:

There is a rule in OpsMgr that is targeted at the All Management Servers Resource Pool:

The rule executes the "p_OptimizeIndexes" stored procedure, every day at 2:30AM.

This rule cannot be changed or modified.  Therefore, we need to ensure there is no other SQL maintenance (including backups) running at 2:30 AM, or performance could be impacted.

If you want to view the built-in UPDATE STATISTICS and REINDEX jobs history - just run the following queries:

SELECT TableName,
OptimizationStartDateTime,
OptimizationDurationSeconds,
BeforeAvgFragmentationInPercent,
AfterAvgFragmentationInPercent,
OptimizationMethod
FROM DomainTable dt
inner join DomainTableIndexOptimizationHistory dti
on dt.domaintablerowID = dti.domaintableindexrowID
ORDER BY OptimizationStartDateTime DESC

SELECT TableName,
StatisticName,
UpdateStartDateTime,
UpdateDurationSeconds
FROM DomainTable dt
inner join DomainTableStatisticsUpdateHistory dti
on dt.domaintablerowID = dti.domaintablerowID
ORDER BY UpdateStartDateTime DESC

Take note of the update/optimization duration seconds column.  This will show you how long your maintenance is typically running.  In a healthy environment these should not take very long.

In general - we would like the "Scan density" to be high (Above 80%), and the "Logical Scan Fragmentation" to be low (below 30%).  What you might find... is that *some* of the tables are more fragmented than others, because our built-in maintenance does not reindex all tables.  Especially tables like the raw perf, event, and localizedtext tables.
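If you want to see for yourself which tables the built-in maintenance is leaving fragmented, the modern equivalent of those DBCC SHOWCONTIG numbers is the sys.dm_db_index_physical_stats DMV. A hedged example, assuming the default database name of OperationsManager:

-- Current fragmentation per index in the Operations database
USE OperationsManager;
SELECT OBJECT_NAME(ips.object_id) AS TableName,
       i.name AS IndexName,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID('OperationsManager'), NULL, NULL, NULL, 'LIMITED') AS ips
INNER JOIN sys.indexes AS i
    ON i.object_id = ips.object_id AND i.index_id = ips.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;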

 

This brings us to the new perspectives in SCOM 2016, especially when used with SQL 2016.

 

In SQL 2016, some changes were made to optimize performance, especially when using new storage subsystems that leverage new disks like SSD.  The net effect of these changes, on SCOM databases, is that they will consume much more space in the database, than when using SQL 2014 and previous.  The reason for this is deeply technical, and I will cover this later.  But what you need to understand as a SCOM owner, is that the sizing guidance will not match up with previous versions of SQL, compared to SQL 2016.  This isn't a bad thing, you just need to make some minor changes to counteract this.

SCOM inserts performance and event data into the SCOM database via something called BULK INSERT.  When we bulk insert the data, SCOM is designed to use a fairly small batch size by default.  In SQL 2016, this creates lots of unused reserved space in the database, that does not get reused.  If you review a large table query – you will observe this as “unused” space.

image

Note in the above graphic – the unused space is over 5 TIMES the space used by actual data!

image

If you want to read more about this – my colleague Dirk Brinkmann worked on discovering the root cause of this issue, and has a great deep dive on this:

https://blogs.technet.microsoft.com/germanageability/2017/07/07/possible-increased-unused-disk-space-when-running-scom-2016-on-sql2016/

The SQL server team also recently added a blog post describing the issue in depth:

https://blogs.msdn.microsoft.com/sql_server_team/sql-server-2016-minimal-logging-and-impact-of-the-batchsize-in-bulk-load-operations/

 

 

Do not despair.  In order to clean up the unused space, a simple Index Rebuild or at a minimum Index Reorganize for each table is all that is needed.  HOWEVER – these perf tables are NOT indexed by default!  This was likely done when SCOM was designed, because these are not static tables, they contain transient data in the OpsDB, that is only held for a short amount of time.  The long term data is moved into the Data Warehouse DB, where it is aggregated into hourly and daily tables – and those are indexed via built in maintenance. 

To resolve this, and likely improve the performance of SCOM, I recommend that each SCOM customer set up a SQL Agent job that handles index maintenance for the entire OpsDB once a day.  I'd say, given the other schedules, a start time between 3:00 AM and 6:00 AM would likely be a good time for this maintenance.  That lets the built-in maintenance run and won't conflict with too much.  You should try to avoid having anything running at 4:00 AM because of the alert auto-resolution; we don't want any blocking going on for that activity.

There are other performance benefits to reindexing the entire database, as many new visualization tables have been added over time, and these don't get hit by our built-in maintenance.

 

A great set of maintenance TSQL scripts for Agent Jobs plan can be found at https://ola.hallengren.com/

Specifically the index maintenance plan at https://ola.hallengren.com/sql-server-index-and-statistics-maintenance.html

This is a well thought out maintenance plan that analyzes the tables and chooses to reindex or reorganize based on fragmentation thresholds, skipping tables that don't need it at all.  The first time you index the entire DB, it may take a long time.  Once you set this up to run daily, it will mostly be optimizing only the daily perf and event tables, each of which is a single table containing one day's worth of data.
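As a hedged example of what such a nightly SQL Agent job step might run once the maintenance solution above is installed, using the IndexOptimize procedure it provides (parameter names are as documented on that site; the thresholds here are illustrative, so adjust them to your environment and verify against the current version):

EXECUTE dbo.IndexOptimize
    @Databases = 'OperationsManager',
    @FragmentationLevel1 = 10,   -- reorganize above this fragmentation percentage
    @FragmentationLevel2 = 30,   -- rebuild above this fragmentation percentage
    @UpdateStatistics = 'ALL',
    @OnlyModifiedStatistics = 'Y',
    @LogToTable = 'Y';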

After a reindex – I have freed up a TON of space.  Here is the same DB:

image

 

 

 

 

Data Warehouse Database:

The data warehouse DB is also self-maintaining.  This is called out by a rule "Standard Data Warehouse Data Set maintenance rule" which is targeted to the "Standard Data Set" object type.  This stored procedure is called on the data warehouse every 60 seconds.  It performs many, many tasks, of which Index optimization is but one.

image

This SP calls the StandardDatasetOptimize stored procedure, which handles any index operations.

To examine the index and statistics history - run the following query for the Alert, Event, Perf, and State tables:

select basetablename,
optimizationstartdatetime,
optimizationdurationseconds,
beforeavgfragmentationinpercent,
afteravgfragmentationinpercent,
optimizationmethod,
onlinerebuildlastperformeddatetime
from StandardDatasetOptimizationHistory sdoh
inner join StandardDatasetAggregationStorageIndex sdasi
on sdoh.StandardDatasetAggregationStorageIndexRowId = sdasi.StandardDatasetAggregationStorageIndexRowId
inner join StandardDatasetAggregationStorage sdas
on sdasi.StandardDatasetAggregationStorageRowId = sdas.StandardDatasetAggregationStorageRowId
ORDER BY OptimizationStartDateTime DESC

In the data warehouse - we can see that all the necessary tables are being updated and reindexed as needed.  When a table is 10% fragmented - we reorganize.  When it is 30% or more, we rebuild the index.

Since we run our maintenance every 60 seconds, and only execute maintenance when necessary, there is no "set window" where we will run our maintenance jobs.  This means that if a DBA team also sets up a UPDATE STATISTICS or REINDEX job - it can conflict with our jobs and execute concurrently. 

I will caveat the above statement with findings from the field.  We have some new visualization tables and management-type tables that do not get optimized, and this can lead to degraded performance.  An example of that is http://www.theneverendingjourney.com/scom-2012-poor-performance-executing-sdk-microsoft_systemcenter_visualization_library_getaggregatedperformanceseries/   They found that running Update Statistics every hour was beneficial in reducing the CPU consumption of the warehouse.  If you manage a very large SCOM environment, this might be worth investigating.  I have seen many support cases that resulted in a manual run of Update Statistics to resolve a performance issue.
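If you do decide to experiment with that approach, one simple, hedged starting point is the built-in sp_updatestats procedure run against the warehouse (default name OperationsManagerDW), scheduled as its own SQL Agent job:

-- Update out-of-date statistics across the data warehouse
USE OperationsManagerDW;
EXEC sp_updatestats;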

For the above reasons, I would be careful with any maintenance jobs on the Data Warehouse DB beyond a CHECKDB and a good backup schedule.   UNLESS you are going to analyze the data, determine which areas aren't getting index maintenance, or determine how out of date your statistics get.  Then ensure any custom maintenance won't conflict with the built-in maintenance.

 

 

Lastly - I'd like to discuss the recovery model of the SQL databases.  We default to "simple" for all our DBs.  This should be left alone.... unless you have *very* specific reasons to change it.  Some SQL teams automatically assume all databases should be set to the "full" recovery model.  This requires that they back up the transaction logs on a very regular basis, but gives the added advantage of restoring up to the time of the last t-log backup.  For OpsMgr this is of very little value, as the data changing on an hourly basis is of little value compared to the complexity added by moving from simple to full.  Also, changing to full means that your transaction logs will only checkpoint once a t-log backup is performed.  What I have seen is that many companies aren't prepared for the amount of data written to these databases.... and their standard transaction log backups (often hourly) are not frequent enough (or the t-logs big enough) to keep them from filling.  The only valid reason to change to FULL, in my opinion, is when you are using an advanced replication strategy, like SQL Always On or log shipping, which requires the full recovery model.  When in doubt - keep it simple.
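A quick way to confirm what your databases are actually set to, assuming the default SCOM database names:

-- Check the current recovery model of the SCOM databases
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name IN ('OperationsManager', 'OperationsManagerDW');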

P.S....  The Operations Database needs 50% free space at all times.  This is for growth, and for re-index operations to be successful.  This is a general supportability recommendation, but the OpsDB will alert when this falls below 40%. 

For the Data warehouse.... we do not require the same 50% free space.  This would be a tremendous requirement if we had a multiple-terabyte database!

Think of the data warehouse as having two stages: a "growth" stage, while it is adding data and not yet grooming much (it hasn't hit the default 400 days of retention), and a "maturity" stage, where agent count is steady, MPs are not changing, and grooming is happening because we are at 400 days of retention.  During "growth" we need to watch and maintain free space, and monitor for available disk space.  In "maturity" we only need enough free space to handle our index operations.  When you start talking about 1 terabyte of data.... that means 500 GB of free space, which is expensive.  If you cannot allocate it.... then just allow auto-grow and monitor the database.... but always plan for it from a volume size perspective.

For transaction log sizing - we don't have any hard rules.  A good rule of thumb for the OpsDB is ~20% to 50% of the database size.... this all depends on your environment.  For the Data warehouse, it depends on how large the warehouse is - but you will probably find steady state to require somewhere around 10% of the warehouse size or less.  When we are doing any additional grooming of an alert/event/perf storm.... or changing grooming from 400 days to 300 days - this will require a LOT more transaction log space - so keep that in mind as your databases grow.
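To see how close you are to those free-space targets, here is a simple per-file check, again assuming the default database name (run the same query in the warehouse by changing the USE statement):

-- Allocated, used, and free space per file in the Operations database
USE OperationsManager;
SELECT name AS FileName,
       size / 128 AS AllocatedMB,
       FILEPROPERTY(name, 'SpaceUsed') / 128 AS UsedMB,
       (size - FILEPROPERTY(name, 'SpaceUsed')) / 128 AS FreeMB
FROM sys.database_files;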

 

 

 

Summary:  (or TL;DR; )

image

1.  Set up a nightly Reindex job on your SCOM Operations Database for best performance and to reduce significant wasted space on disk.

2.  You can do the same for the DW, but be prepared to put in the work to analyze the benefits if you do.  Running a regular (multiple times a day) Update Statistics has proven helpful to some customers.

3.  Keep your DB recovery model in SIMPLE mode, unless you are using AlwaysOn replication.

4.  Ensure you pre-size your databases and logs so they are not constantly autogrowing, and keep plenty of free space as required for supportability.

NetBIOS Domain Name Incorrect in SCOM 2016


I noticed this when we upgraded to SCOM 2016.  After we applied UR3 the issue still persisted.  The NetBIOS Domain Name property of the Windows Computer class was not correct.  You can see this through the UI or by querying the database.

SELECT PrincipalName, DomainDnsName, ForestDnsName, NetbiosDomainName FROM OperationsManager.dbo.MTV_Computer

Correct:

PrincipalName:      ComputerABC.redmond.corp.microsoft.com
DomainDnsName:      redmond.corp.microsoft.com
ForestDnsName:      corp.microsoft.com
NetbiosDomainName:  REDMOND

Incorrect:

PrincipalName:      ComputerABC.redmond.corp.microsoft.com
DomainDnsName:      redmond.corp.microsoft.com
ForestDnsName:      corp.microsoft.com
NetbiosDomainName:  redmond.corp.microsoft.com

The issue appears here

  • Data Source: Microsoft.SystemCenter.WindowsComputerPropertyDiscovery
  • Discovery: Microsoft.SystemCenter.DiscoverWindowsComputerProperties
  • Management Pack: Microsoft.SystemCenter.Internal

I checked versions 7.0.8433.0 and 6.1.7221.0 (Yes, I keep copies of the SCOM 2007 R2 Management Packs) and they both use a JScript called DiscoverWindowsComputerProperties.js which correctly identifies the NetBIOS Domain Name.

I found that later versions 7.0.8437.0 and 7.0.8437.7 of the management pack had switched to a PowerShell script called DiscoverWindowsComputerProperties.ps1 which incorrectly returns the Domain property of the Win32_ComputerSystem class in WMI as the NetBIOS domain name.
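For comparison, here is an illustrative PowerShell sketch (not the contents of either discovery script) showing the difference: the Domain property of Win32_ComputerSystem returns the DNS domain name, while the NetBIOS name can be looked up from the nETBIOSName attribute of the domain's crossRef object in the configuration partition.

# Illustrative only - not the SCOM discovery script
# What the PowerShell discovery returns (the DNS domain name):
(Get-WmiObject Win32_ComputerSystem).Domain                    # e.g. redmond.corp.microsoft.com

# One way to resolve the actual NetBIOS name of that domain:
$rootDse  = [ADSI]"LDAP://RootDSE"
$searcher = New-Object System.DirectoryServices.DirectorySearcher(
    [ADSI]("LDAP://CN=Partitions," + $rootDse.configurationNamingContext),
    "(&(objectClass=crossRef)(nCName=$($rootDse.defaultNamingContext)))")
[void]$searcher.PropertiesToLoad.Add("nETBIOSName")
$searcher.FindOne().Properties["netbiosname"][0]               # e.g. REDMOND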

While I am hopeful that this issue will be corrected in a future Update Rollup, I have created a management pack that disables the PowerShell discovery and replaces it with the JScript discovery from version 7.0.8433.0 of the management pack.  I am attaching the unsealed management pack to this post, which you may use or evaluate if you so choose.  This management pack is not officially supported or endorsed by Microsoft, so use it at your own risk.

Link to download management pack: UnsealedMP

[mstep] Recommended Courses for August–September 2017 – "Office 365: The Latest Proposal Scenarios That Resonate with Customers, and How to Demo Them" and More [Updated 8/4]


mstep

mstep is a full-fledged classroom/online training program available to partners who participate in the Microsoft Partner Network.

Registration is first come, first served and closes as soon as a course reaches capacity, so please sign up early.

mstep courses are free of charge for MPN partners, so please take full advantage of them.

 

mstepclassroom

 

******************************************************************************************************

8/17

Office 365: The Latest Proposal Scenarios That Resonate with Customers, and How to Demo Them

<Overview>

Whatever Microsoft products or services you handle, Office 365 is the foundational cloud service. This course introduces the baseline scenarios that Microsoft cloud partners can use to sell Office 365 to small and midsize customers, explaining concrete proposal scenarios that address customer needs, complete with demos. It conveys the value of the Office 365 Business Premium plan for small and midsize businesses and also covers the new Microsoft Teams as well as Microsoft Bookings, which is not included in the Enterprise plans. The seminar is ideal for sales representatives and presales SEs who are new to Office 365, for example at companies that have just become Microsoft partners or people who have taken over Office 365 responsibilities through a staff rotation. It explains how to deliver demos that resonate with customers and hands out demo setup guides, so even those with little Office 365 demo experience can level up their skills in just half a day. Experienced attendees are also encouraged to attend to catch up on the latest information.

******************************************************************************************************

8/23

MCP 70-740 Exam Preparation Seminar – Installation, Storage, and Compute with Windows Server 2016

<Overview>

An exam preparation course for passing Exam 70-740. It works through sample questions aligned with the exam's tendencies.

******************************************************************************************************

8/24

MCP 70-697 Exam Preparation Seminar – Configuring Windows Devices

<Overview>

A course for passing Exam 70-697. It explains the key exam points on a question-by-question basis.

******************************************************************************************************

8/29

MCP 70-346/70-347 Exam Preparation Seminar – The Two Exams for MCSA: Office 365

<Overview>

Acquire the knowledge needed to pass MCP Exam 70-346 "Managing Office 365 Identities and Requirements" and Exam 70-347 "Enabling Office 365 Services." (Basic administration of Office 365 and the individual online services is not covered in this course.)

******************************************************************************************************

9/8

MCP 70-698 Exam Preparation Seminar – Installing and Configuring Windows 10

<Overview>

Explains the knowledge required to pass Exam 70-698 on a question-by-question basis.

****************************************************************************************************** 

9/27

MCP 70-741 Exam Preparation Seminar – Networking with Windows Server 2016

<Overview>

Explains the Windows Server 2016 networking features covered by MCP Exam 70-741, "Networking with Windows Server 2016," and reviews how to solve the key questions. Note that this course is focused on exam preparation rather than learning Windows Server 2016 networking from scratch; to build the networking fundamentals, we recommend the "MSU 20741 Networking with Windows Server 2016" course.

******************************************************************************************************

9/29

MCP 70-742 Exam Preparation Seminar – Identity with Windows Server 2016

<Overview>

Explains the Windows Server 2016 identity features covered by MCP Exam 70-742, "Identity with Windows Server 2016," and reviews how to solve the key questions. Note that this course is focused on exam preparation rather than learning Windows Server 2016 identity management from scratch; to build the identity fundamentals, we recommend the "MSU 20742 Identity with Windows Server 2016" course.

******************************************************************************************************

[Other public classroom courses are listed here]

mstepclassroom

* Please note that registration for some mstep classroom courses may already have closed.

 


msteponline

******************************************************************************************************

[mstep Online 30-Minute Series] Mobility Solutions for the Cloud Era – Enterprise Mobility + Security (April 2017)

<Overview>

A video that compactly summarizes the EMS product overview and the points you need to know in about 30 minutes. Ideal for those who want to learn EMS from the basics, or who only understand it in fragments and want a refresher. (Product information is as of April 2017.)

******************************************************************************************************

Building Microsoft HCI (Hyper-Converged Infrastructure) with Windows Server 2016 (April 2017)

<Overview>

This course, aimed at IT pros, walks through HCI (hyper-converged infrastructure) built on the new Hyper-V and storage features of Windows Server 2016 Datacenter, from the fundamentals through deployment steps and operations management.

******************************************************************************************************

Overview of Dynamics 365 Administration Features (May 2017)

<Overview>

This seminar explains the administration-related features available when deploying Dynamics 365 Enterprise edition Plan 1 and the individual apps included in Plan 1.

******************************************************************************************************

Microsoft Azure Virtual Machines for IT Pros: Advanced (June 2017)

<Overview>

This course gives IT administrators a quick introduction to advanced use of Microsoft Azure virtual machines. It mainly covers new features and services not covered in "Microsoft Azure Virtual Machines for IT Pros: Fundamentals."

******************************************************************************************************

Building Microsoft Azure Networks: Advanced (June 2017)

<Overview>

This course introduces IT administrators to advanced use of Microsoft Azure networking and the practical techniques needed to build networks. Simply by attending, participants can quickly grasp the overall picture of Azure networking and become able to build networks themselves.

******************************************************************************************************

 

[Other online courses are listed here]

msteponline

 

 

Imagine Cup 2017 World Finals Comes to a Close


English follows

 

Posted by: Drew Robbins
Executive Officer, Commercial Software Engineering Lead

Hello everyone. I'm Drew Robbins, and I lead Commercial Software Engineering in Japan.

Imagine Cup 2017, the world's largest student technology competition and now in its 15th year, was held in Seattle, USA over two days starting July 24. The team that fought its way through a tournament of 53 teams from 39 countries selected around the world was Team X.GLU from the Czech Republic and Slovakia. Team X.GLU developed a blood glucose monitoring system for children with diabetes, impressing the judges and many participants, and was awarded US$100,000 in prize money plus a US$125,000 Microsoft Azure grant to carry their project forward. The team also won a one-on-one mentoring session with Microsoft CEO Satya Nadella and a ticket to the Build 2018 developer conference.

From Japan, Team NeuroVoice from the University of Tokyo graduate school, which entered an app that converts input speech into any person's voice, and Team TITAMAS from Tokyo Institute of Technology, which entered a smart white cane for the visually impaired, took on the World Finals. For roughly three months after being selected, both teams worked alongside their studies to improve their devices, software, and English presentations. At the event itself, although it was a tough environment for students whose first language is not English, they delivered wonderful presentations full of passion that captivated not only the judges but also teams and media from other countries.

Team TITAMAS advanced to the quarterfinals but did not reach the semifinals, and both teams then took on the wild-card round but were unfortunately eliminated. On this year's judging criteria of "technology," "innovation," and "concept," the Japanese teams were by no means behind the other countries. If anything was lacking compared with the four finalist teams, I believe it was the fourth criterion, "feasibility," that is, whether a viable business model has been established. It is a small share of the overall judging criteria, but it is no exaggeration to say that in a competition where the world's top students gather, this difference decides who wins and loses. I sincerely salute Team NeuroVoice and Team TITAMAS, who competed in the World Finals and fought hard to the very end.

Students' spirit of taking on challenges, their growth, and their passion for technology are what the Imagine Cup is all about, and I am convinced they are a symbol of a better future. Microsoft will continue to actively support students through the Imagine Cup and other programs. Students who feel up to it, please take on next year's Imagine Cup and create a better future with the power of technology! We look forward to your entries.

For more details about the Imagine Cup, see https://www.imaginecup.com/ (in English). Materials and resources are also available on the Imagine Cup microsite of the Microsoft News Center: https://news.microsoft.com/imaginecup2017/ (in English)

---

Posted by: Drew Robbins
Executive Officer, Commercial Software Engineering Lead

Hello everyone. I’m Drew Robbins, the group lead of the Commercial Software Engineering in Japan.

The world's largest student technology competition, Imagine Cup 2017, celebrating its 15th anniversary this year, was held in Seattle, USA for two days from July 24th. Team X.GLU from the Czech Republic and Slovakia was crowned the champion of the competition, beating 53 other teams from 39 countries to the top prize. Team X.GLU impressed the judges with their invention – a blood glucose meter prototype for diabetic children – and walked home with US$100,000 in prize money and US$125,000 in Microsoft Azure grants to further their innovative project.

The first-place team also won a 1-to-1 mentoring session with Microsoft CEO Satya Nadella, as well as a trip to the Build 2018 developer conference.

I'm excited for the teams who came from Japan as challengers at the world championship: Team NeuroVoice from the University of Tokyo graduate school, entering an application that converts an input voice into another person's voice, and Team TITAMAS from Tokyo Institute of Technology, entering a smart white cane device for the visually impaired. After both teams were selected as World Finalists, they worked very hard to improve their devices, software and presentations. They worked for nearly three months on top of their regular school work. Although it was a tough environment for English-as-a-foreign-language students, they showed off wonderful presentations, full of passion, which attracted team members and media from other countries as well as the judges.

Team TITAMAS advanced all the way to the quarterfinal, and both teams later challenged the "Wild Card" round. The Japanese teams were strong in "Technology", "Innovation" and "Concept", which were the judging criteria of this competition. The one area the Japan teams struggled in compared to the finalist teams was "Feasibility", which judges whether the business model is established. Although it is a small proportion of the whole judging criteria, it is no exaggeration to say that this difference can decide winning or losing in a competition where world-class students participate. I sincerely express my respect to Team NeuroVoice and Team TITAMAS, who participated in this world competition and did well to the end.

The challenging spirit and passion for technology of students is the meaning of Imagine Cup, and these students are symbolic of our hope for a better future built on technology. We’re confident that they will go on to achieve more with this experience. Students, we are waiting for your challenge next year!

---

All content in this blog is current as of the date of writing and is subject to change without notice. Where formal internal approval or contracts with other companies are required, nothing is final until they are in place. In addition, for various reasons and circumstances, some or all of the content may be changed, cancelled, or become difficult to realize. Thank you for your understanding.

Sending fails when you open and send a message saved as an .msg or .oft file with recipients selected from a shared Contacts folder


This is the Microsoft Japan Outlook support team.
This post explains a symptom in which sending fails when you open and send a saved message whose recipients were selected from a shared Contacts folder.

Symptom:
If you select recipients from a shared Contacts folder, save the message on your computer as an .msg or .oft file, and later open the file and try to send it, the send may fail.

Conditions:
This occurs for messages whose recipients were selected from Contacts shared by another person, when that shared Contacts folder is not open at the time of sending.
Likewise, it also occurs when the recipients were selected from Contacts in a PST file and that PST file is not open at the time of sending.

Cause:
For recipients added from Contacts, only the entry ID of the contact is stored.
At that point the name is considered resolved, so no further name resolution is performed.
At send time, Outlook retrieves the recipient information from the stored contact entry ID. If the contact was selected from somewhere other than the Contacts folder in your own mailbox, such as a shared Contacts folder or Contacts in a PST file, a connection to that Contacts folder is required.
If it is not connected, the recipient information cannot be retrieved and the send fails.

Workaround:
Open the Contacts folder that contains the specified contacts, such as the shared Contacts folder or the PST file.

The contents of this information (including attachments and links) are current as of the date of writing and are subject to change without notice.

The Classroom in Microsoft Teams


As we have written previously, and as you have probably noticed elsewhere, we are shutting down Microsoft Classroom because we have built a much better experience in Microsoft Teams. We therefore thought it would be a good idea to put together a Danish-language overview of the new things we have brought to Microsoft Teams. So here it is! It turned out a bit longer than first expected, but we hope it brings some clarity.

The new teams

We have introduced four types of teams that you can create:

These new teams create different kinds of notebooks and offer different administration options for teachers and students:

Classes: This is your class team, or for projects across classes. This is the team where the teacher is the administrator for a group of students.

PLCs: For teachers' subject groups and other professional learning communities. This is where teachers can share knowledge and experience with each other.

Staff: If school leadership needs a single place to communicate and collaborate with all staff at the school, this is the team type to use.

Anyone: This is the classic type of team, where anyone can be added and anyone can participate.

Class Notebook / Staff Notebook

Many of you probably already know Class Notebook, our OneNote for teaching and assignments. You can read much more about it here. Class Notebook and Staff Notebook have been moved into Microsoft Teams, so depending on which type of team you create, a notebook is created with the corresponding features, such as individual sections, grading, and distribution options.

Assignments

As something new, we have introduced an end-to-end assignment hand-in flow, with grading to go with it! It is now possible to create an assignment for the students, have them hand it in in Teams, and then go in as the teacher and grade it.

Persistent chat

With Microsoft Teams you get a persistent chat as the hub of communication in the class. With a persistent chat you never lose the questions that are asked, you can tag each other, and you can create more collaboration and knowledge sharing across the class in the virtual space.

Tabs and Apps

At the heart of Microsoft Teams are tabs and apps. With the first-party and third-party apps being delivered, you can extend your team with the functions that matter to you! Whether you want to embed a YouTube page as a tab, use Kahoot for a review quiz, or run polls with Polly, they are integrated into Teams. More and more education-specific integrations keep coming to Teams, making it the perfect tool for your teaching.

Get started!

Are you just waiting to get started? Fortunately, I can tell you it is completely free to use with your Office 365 Education, so if your school has Office 365 Education, just go find Microsoft Teams and create your teams.

FAQ

Q: I don't know whether my school has Office 365 Education.

A: Ask your IT administrator.

Q: My school doesn't have Office 365 Education. How do we get it, and does it cost anything?

A: Office 365 Education is completely free for educational institutions. Contact your Microsoft partner to learn more.

Q: My school has Office 365 Education, but I can't see or create Microsoft Teams.

A: Your IT administrator most likely needs to enable it first.

Q: Where can I read more about this?

A: Right here: https://educationblog.microsoft.com/2017/06/collaborative-classroom-features-now-available-microsoft-teams/

If you have more questions, you are welcome to contact me at t-krthor@microsoft.com


[EMS] Relaxed Port Requirements for the Various Azure AD Connectors


Thank you for reading the Device & Mobility Team Blog. This is Suzuki, in charge of EMS.
Over the past few months the port requirements for the various connectors, centered on Azure AD Connect, have been relaxed.

This means you can deploy the connectors without opening unnecessary ports on your firewall, which lowers the bar for adoption.
Here is a recap of the port requirements for each connector as of today (August 5, 2017).
References
Ports and protocols required for hybrid identity
https://docs.microsoft.com/ja-jp/azure/active-directory/connect/active-directory-aadconnect-ports
Application Proxy overview and connector installation
https://docs.microsoft.com/ja-jp/azure/active-directory/active-directory-application-proxy-enable

① Azure Active Directory Connect
The port requirements for Azure AD Connect are now only 80 and 443.
② Azure AD Application Proxy connector
The port requirements for the Azure AD Application Proxy connector are now only outbound traffic on 80 and 443.
Connector version 1.5.132.0 or later is required.
③ Azure AD Pass-through Authentication
Azure AD pass-through authentication uses Azure AD Connect.
Thanks to the relaxed Azure AD Connect port requirements, it can now be deployed without opening any high-numbered ports.
④ Azure AD Connect Health
This module is partly included in Azure AD Connect, but in addition to port 443 it requires port 5671 to be allowed for connecting to Azure Service Bus.

With the port requirements relaxed significantly as described above, the bar for adoption is lower and management should be easier as well.
If you are running an older version and have not applied updates, I recommend taking the opportunity to review, replace, and maintain your environment.

Scripting Tips & Tricks: Review Your Comments


I wrote a PowerShell module recently as part of an update to a service we provide to customers. I needed to dump out all of the comments so they could be submitted for review. I wanted to ignore comment based help sections.

 

Here's how I did it with the aid of RegEx...

 


$a = gc "X:\Dev\Modules\BOOM\BOOM.psm1"

$a | select-string -Pattern "^\s*#[^#>]"

 

 

The same technique could be used to check another module, a lengthy script or a detailed advanced function. Have fun!
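The same one-liner also scales to a whole folder of scripts, since Select-String accepts file input from the pipeline (the path here is just an example):

Get-ChildItem "X:\Dev" -Recurse -Include *.ps1,*.psm1 | Select-String -Pattern "^\s*#[^#>]"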

 

Calendar Week 31 in Review: Ten Interesting Links for IT Experts


3 Reasons Why Microsoft Stream Is the Right Video Platform for Internal Corporate Communications


At Microsoft Germany we are making ever greater use of video for our internal communications, whether for a short statement on current topics from the head of Microsoft Germany or for our own news format (called Microsoft Business TV), in which we package communication content so it is easy to consume. To distribute the video content internally as quickly as possible and in good quality, we use the Microsoft Stream video platform.

Why is Microsoft Stream well suited to internal communications?

1. Audio transcription, face detection, and captions

Thanks to audio transcription and face detection, finding relevant content is effortless. That even applies to individual spoken words or to people on screen, whether in one video or across all videos in your company. Microsoft Stream also offers intelligent features such as captions (so far unfortunately only available for English and Spanish) for accessibility, so that all employees can participate according to their needs.

2. Microsoft Stream + Office 365 = dream team

Easy uploading, viewing, and sharing of videos directly in the collaboration apps used most in your company, such as Microsoft Teams, SharePoint, and Yammer, and you only have to sign in once. For us it is, for example, the link between Microsoft Stream and Yammer: upload the video to Stream, share it by link in our Microsoft Germany Yammer group, and the integration is in place in no time. If I want to embed a video in SharePoint, I do that with the embed code.

3. Easy access and high security

All employees in the company can share content on any device and from anywhere, within a secure corporate environment. Thanks to simple controls, employees can decide who they want to share something with, for example the entire organization or only individual users or groups; Azure Active Directory is the machinery in the background.

If you have further questions about Microsoft Stream and about how we use video in internal communications at Microsoft Germany, feel free to comment here or ask me on Twitter @FrauBiBau. For answers to the most frequently asked questions about Microsoft Stream, head this way.

A post by Bianca Bauer
Internal Communications Manager and Video Specialist

As Internal Communications Manager & Video Specialist, Bianca Bauer drives internal communications at Microsoft Germany across the digital trio of channels Yammer, newsletter, and intranet. In her role as Video Specialist she is responsible for new video formats and strategies in the in-house TV studio and at events.

Making Good Use of Office 365 for Smart, Efficient Management


The Michelin Guide, the benchmark of the food world, entered China for the first time last year with a Shanghai edition. "Lao Kanpai," the Shanghai branch of the yakiniku restaurant under Taiwan's Kanpai Group, earned one Michelin star, giving the Kanpai Group even more confidence in operating in the China market.

Taiwan's domestic market is small, and with many players chasing limited demand, food and beverage operators have been setting their sights on overseas markets in search of future growth. The Kanpai Group, which started out running yakiniku izakaya, now also operates a Japanese-style hot pot brand and imports and represents restaurant brands from Japan. It currently has about forty locations and, targeting overseas markets, has opened stores in Shanghai, China.

Because the Chinese government strictly controls network systems, when Kanpai posted Taiwanese managers to China to open new locations, it carefully evaluated which networked software would be convenient for communication among the company's management.

[Works in the China market, convenient and smooth to operate]

"Our biggest consideration was which digital platform could actually be used in China. Communication and coordination had to be smooth, real-time, and efficient in every respect, and the platform had to be trustworthy, able to provide stable, continuous service, and highly secure," emphasizes Liao Chia-yi, manager of the CEO's office at the Kanpai Group.

Weighing all of these requirements, the Kanpai Group ultimately chose Microsoft Office 365. Liao points out that the group operates across northern, central, and southern Taiwan as well as Shanghai. Brand managers, regional managers, and store managers are far from headquarters, so when headquarters works with them on store management strategy, passes on information, and exchanges intelligence, Office 365's help is essential.

She adds that Kanpai's store managers and regional managers are constantly on the move, so there used to be a gap in how they received messages from headquarters, and sending and receiving email was inconvenient. Since headquarters adopted Office 365's digital services, colleagues can handle email smoothly from the phones and tablets they carry with them.

Office 365 email is the tool Kanpai's management uses most. After installing the Office 365 email app on their phones and tablets, managers only need to enter an account and password to start sending and receiving mail right away, without the multi-layered setup that used to be required.

Liao, for example, regularly needs to forward mail automatically. She used to set this up in Outlook on a desktop PC, but when she was on leave or away from the computer those forwards could not go out; with Office 365 she no longer has to open her computer to do it. "When my mailbox receives a message, it can forward it automatically, so the feature keeps working even when I am on vacation," she says, adding that Office 365's simple interface is easy for users to pick up.

Kanpai's rollout of Office 365 went extremely smoothly. The group's staff are generally young and used to installing apps on their phones, so getting Office 365 onto their phones was no problem at all. "Just sign in with your account and password and you're done!" Liao notes that sending and receiving email on a phone used to require many layers of configuration, so Office 365's simple interface is much easier for users to get started with.

In addition, as colleagues moved from Outlook to Office 365 for email, they gradually discovered how convenient the other Office 365 tools are; Liao particularly praises the features for booking meeting rooms and scheduling group meetings.

"Our headquarters offices are spread across three different buildings, which makes coordination inconvenient," Liao says. Previously, booking a meeting room meant signing a paper register, and for an ad hoc meeting you had to phone a colleague near the meeting room to check the register for you. With Office 365, colleagues can immediately see which rooms are free, which greatly simplifies the booking process and saves time.

[Seamless communication across regions, departments, and offices]

Kanpai staff make heavy use of Skype for Business in Office 365 to hold real-time online meetings across regions, departments, and offices, including face-to-face online discussions with managers in China and brand partners in Japan.

A large share of service industry work is communication, Liao stresses, and Office 365's cloud services solve Kanpai's cross-region, cross-department, cross-office communication problems. Kanpai will keep opening locations in China and posting more Taiwanese managers to cities there. Because China is so vast, gathering managers scattered across different cities for meetings would cost headquarters enormous time and travel expense, so Kanpai expects Office 365 to keep delivering communication without regard to distance for the group.

Younger generations prefer digital service experiences, so digital management and service in food and beverage is developing at full speed in China. Kanpai is learning from digital service innovation in the China market and bringing it back to Taiwan.

Liao notes that China's restaurant industry adopts digital equipment very quickly, and Kanpai's managers are also observing the payment methods commonly used there. Kanpai plans to offer payment services in its stores in Taiwan's department store malls, and if that goes smoothly, to extend them to street-front stores.

Kanpai headquarters runs a very lean IT team, so internal software relies as much as possible on ready-made external resources. Because Office 365 takes a considerable workload off Kanpai's IT staff, they now have the capacity to develop tailor-made solutions for each department's information needs, using digital capability to boost Kanpai's operations.

Liao Chia-yi, Manager of the CEO's Office, Kanpai Group

LaTeX Math Input in Office


(This article is a translation of LaTeX Math in Office, posted on July 30, 2017 on Murray Sargent: Math in Office. For the latest information, please see the original article.)

You can now switch Word's math input mode from UnicodeMath to LaTeX. This long-requested feature still had some unfinished parts, so until now we had not announced it widely. The prototype conversion routines were written in the fall of 2007. At the time they were very handy for copying equations from Wikipedia pages onto slides while preparing physics presentations in PowerPoint, but with various other projects running in parallel, testing the feature for product release was delayed. Starting in August, LaTeX math can finally be used in Office 365. As described in the article "Linear format equations using UnicodeMath and LaTeX in Word" (in English), Word's new Math ribbon shows a clearly visible LaTeX option. Below is an image of the left end of Word's Math ribbon; in this example, LaTeX is selected as the current input format.

In Word's LaTeX mode, math autobuild is currently disabled while it is being improved. Word gains two new hot keys: one to build up the math zone (Ctrl + =) and one to build it down (Shift + Ctrl + =). The hot key to insert or delete a math zone (Alt + =) is also available.

Using LaTeX in PowerPoint and OneNote

At present, enabling LaTeX in OneNote and PowerPoint is a little tricky: to switch the input mode you first need to define new control words in math autocorrect. Press Alt + = to insert a math zone, then click the lower right of the Tools section of the Math ribbon to open the Equation Options dialog. Click Math Autocorrect, type "TeX" in the Replace box, and in the With box type "24C9" followed by Alt + X to enter "Ⓣ". From then on, typing "TeX<space>" in a math zone switches the input format in OneNote or PowerPoint from UnicodeMath to LaTeX. If needed, you can also define a control word to switch the input mode back to UnicodeMath (linear format): type "LF" in the Replace box and "24C1" followed by Alt + X in the With box to enter "Ⓛ". In PowerPoint and OneNote, math autobuild is enabled by default, so you can try this right away.

Math autocorrect

Using a math autocorrect control word such as \integral locks in the math mode.

The equation it inserts is built up from the following UnicodeMath text:

1/2π ∫_0^2π▒dθ/(a+b sin θ)=1/√(a^2-b^2)

When LaTeX input is enabled, this text does not build up correctly. In PowerPoint and OneNote you can define your own math autocorrect control words using LaTeX notation, so that they build up correctly when LaTeX is enabled as well. In LaTeX, the equation above is written as:

\frac{1}{2\pi}\int_{0}^{2\pi}\frac{d\theta}{a+b\sin{\theta}}=\frac{1}{\sqrt{a^2-b^2}}

If a control word inserts text containing Unicode characters outside the ASCII range, automatically switching to UnicodeMath mode could cause a problem. However, if you switch to XeTeX or another Unicode-aware TeX language, this simple heuristic no longer applies. In fact, Unicode TeX can be represented nicely in the Office math facility. For example, you can write:

\frac1{2π}∫_0^{2π}\frac{dθ}{a+b\sin{θ}}=\frac1{√{a^2-b^2}}

which is easier to read than the pure ASCII notation above. Many Unicode characters can be inserted from the galleries on the Math ribbon.

Control words

The LaTeX option supports all the TeX control words listed in Appendix B of the UnicodeMath specification (in English), including math operators, Greek letters, and other symbols. Verbose LaTeX notations such as \begin{equation} and \begin{matrix} are not supported; instead, the more concise TeX notations \matrix{...} and \pmatrix{...} are. Fractions can be entered either in the LaTeX form \frac{...}{...} or the TeX form {…\over...}. \displaymath is used to create a display line within a math zone and currently is not applied in inline math zones. Unicode math alphanumerics can be entered using control words such as \mathbf{}. In UnicodeMath, you can toggle math bold and italic using the Bold and Italic buttons on the Home ribbon. \scriptX, \doubleX, and \frakturX enter script, blackboard-bold (double-struck), and Fraktur characters, respectively; for example, \scriptS and \mathcal{S} give the same result. The list of supported control words can be found in "Linear format equations using UnicodeMath and LaTeX in Word" (in English).

More improvements are planned. For example, an option to build down to Unicode LaTeX, which is far easier to read than ASCII LaTeX, is under consideration. And if a math autobuild option becomes available in Word, you will be able to check your input easily without having to decipher the control words. We hope users who are already comfortable with LaTeX will find this new capability useful.

How to Handle the iSCSI Target Error During File Recovery (Exception caught while connecting to Target)


Hello, this is Fujimoto from the Azure support team.

This post explains the cause of, and the response to, an iSCSI target error that is logged when you cannot connect to a recovery point while recovering files from backup data taken with Azure Backup.

[Symptom]
When you run the recovery-point connection script, an iSCSI target error occurs and a message saying the connection failed is displayed:
WARNING: Waiting for service 'Microsoft iSCSI Initiator Service (MSiSCSI)' to start…
Exception caught while connecting to Target. Please retry after some time.

[Possible cause]
The virtual machine running the File Recovery script may be unable to reach the recovery point in Azure.
To use File Recovery, the target virtual machine must meet the following conditions.

1. It can connect to go.microsoft.com.
2. It can make HTTP (80) and HTTPS (443) connections to the internet, in order to connect to the backup that was taken.
3. It can connect to port 3260 for the iSCSI connection.

If any of these conditions is not met, the target cannot be reached and the error above is logged.

[What to check]
Check whether your network configuration meets the conditions above, including your network security group (NSG) settings and the firewall and proxy settings of the target virtual machine.
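As a quick hedged check of the three requirements from the virtual machine where you run the File Recovery script, the built-in Test-NetConnection cmdlet is handy; the iSCSI endpoint below is a placeholder, since the actual recovery point endpoint is shown by the downloaded script.

# Outbound HTTPS/HTTP to the internet
Test-NetConnection -ComputerName "go.microsoft.com" -Port 443
Test-NetConnection -ComputerName "go.microsoft.com" -Port 80
# iSCSI (port 3260) to the recovery point endpoint used by the script
$recoveryEndpoint = "<recovery point endpoint from the script>"   # placeholder
Test-NetConnection -ComputerName $recoveryEndpoint -Port 3260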

References
* Recover files from Azure virtual machine backups (preview)
https://docs.microsoft.com/ja-jp/azure/backup/backup-azure-restore-files-from-vm
* Back up Linux virtual machines in Azure
https://docs.microsoft.com/ja-jp/azure/virtual-machines/linux/tutorial-backup-vms


Tip of the Day: Windows Server Software Defined (WSSD)


Today's Tip...

Quote from our blog:

"The island of Bora Bora. The finish line at a marathon. A software defined datacenter. What do they have in common? Being there is easy - getting there is the hard part. However in the latter case, at least, you can let someone else do the hard part for you."

Turn to Windows Server Software-Defined certified partners for the solution!

References:

Azure Stack WAP Connector – Part 1


The Azure Stack WAP connector is in preview now. With the WAP connector, you can enable access from the Azure Stack user portal to tenant virtual machines running on Windows Azure Pack. Tenants can use the Azure Stack portal to manage their existing IaaS virtual machines and virtual networks. These resources are made available on Windows Azure Pack through the underlying Service Provider Foundation (SPF) and Virtual Machine Manager (VMM) components. For more information about the WAP connector, please refer to the link below.
https://docs.microsoft.com/en-us/azure/azure-stack/azure-stack-manage-windows-azure-pack

When we deployed the WAP connector, the most challenging part was integrating the identity management systems between WAP and Azure Stack. Right now we support the following identity management scenarios.

Azure Stack Id Manager    CPS/WAP Id Manager        Notes

AAD                       AD-ADFS                   AD is synchronized with AAD
AAD                       AAD through ACS           CPS using AAD instead of local
AAD                       3rd Party Id Manager      Both sharing AAD

ADFS                      AD-ADFS                   Both sharing same customer AD
ADFS                      AAD through ACS           Both sharing same AAD
ADFS                      3rd Party Id Manager      Both sharing 3rd Party solution

If you're using the ASDK, you will probably find it even more challenging, because in the ASDK (one-node deployment) the whole Azure Stack sits behind NAT.

In this blog, I will demonstrate how to integrate ASDK and WAP's identity management systems. In my setup, Azure Stack uses built-in ADFS (adfs.local.azurestack.external) and WAP uses a separate ADFS (fs.blue.cloud).

 

Pre-requisites

  • Deploy Azure Stack Development Kit (ASDK) and use ADFS type. https://docs.microsoft.com/en-us/azure/azure-stack/azure-stack-run-powershell-script#deploy-the-development-kit
  • Prepare a Separate AD forest (BLUE.CLOUD in my setup).
  • Deploy a WAP environment. In my setup, I put all the WAP roles on single VM (BLUE-WAP.BLUE.CLOUD). I hosted VMM and SPF are on another VM (BLUE-VMM.BLUE.CLOUD)

Prepare Certificates for *.BLUE.CLOUD

1. On BLUE-WAP.BLUE.CLOUD, run the following cmdlets to install CA.

Install-WindowsFeature -Name ADCS-Cert-Authority -IncludeAllSubFeature -IncludeManagementTools

2. Open Server Manager and following the wizard to configure the CA as standalone CA.

3. Create certificate request configuration file called "Bluecloud-Cert.inf".

[Version]
Signature="$Windows NT$"

[NewRequest]
Subject = "CN=*.blue.cloud"   ; For a wildcard use "CN=*.CONTOSO.COM" for example
; For an empty subject use the following line instead or remove the Subject line entierely
; Subject =
Exportable = TRUE                  ; Private key is exportable
KeyLength = 2048                    ; Common key sizes: 512, 1024, 2048, 4096, 8192, 16384
KeySpec = 1                         ; AT_KEYEXCHANGE
KeyUsage = 0xA0                     ; Digital Signature, Key Encipherment
MachineKeySet = True                ; The key belongs to the local computer account
ProviderName = "Microsoft RSA SChannel Cryptographic Provider"
ProviderType = 12
SMIME = FALSE
RequestType = CMC

; At least certreq.exe shipping with Windows Vista/Server 2008 is required to interpret the [Strings] and [Extensions] sections below

[Strings]
szOID_SUBJECT_ALT_NAME2 = "2.5.29.17"
szOID_ENHANCED_KEY_USAGE = "2.5.29.37"
szOID_PKIX_KP_SERVER_AUTH = "1.3.6.1.5.5.7.3.1"
szOID_PKIX_KP_CLIENT_AUTH = "1.3.6.1.5.5.7.3.2"

[Extensions]
%szOID_SUBJECT_ALT_NAME2% = "{text}"
_continue_ = "dns=*.blue.cloud&"
_continue_ = "dns=enterpriseregistration.blue.cloud&"
_continue_ = "dns=fs.blue.cloud&"
_continue_ = "dns=certauth.fs.blue.cloud&"
%szOID_ENHANCED_KEY_USAGE% = "{text}%szOID_PKIX_KP_SERVER_AUTH%,%szOID_PKIX_KP_CLIENT_AUTH%"

[RequestAttributes]

4. Run the following commands to create and submit the certificate request.

cmd.exe /c "certreq -new Bluecloud-Cert.inf Bluecloud-Cert.req"
cmd.exe /c "certreq -submit Bluecloud-Cert.req"

5. Open the CA management console, issue the certificate, and then save it to the file "C:\Bluecloud-Cert.cer".

6. Run the following command to import the certificate into the local machine store.

cmd.exe /c "certreq -accept Bluecloud-Cert.cer"

7. Export certificates.

$cert = get-Childitem cert:\localmachine\my | where-object {$_.Subject -eq "CN=*.blue.cloud"}
$PfxPass = ConvertTo-SecureString "123" -AsPlainText -Force
Export-pfxCertificate -Cert $cert -FilePath c:\Bluecloud-Cert.pfx -Password $PfxPass
$CAcert = Get-Childitem cert:\localmachine\root | where-object {$_.Subject -eq "CN=blue-BLUE-WAP-CA"}
Export-Certificate -Cert $CAcert -FilePath c:\Bluecloud-CACert.cer

8. Configure all the Web Sites on BLUE-WAP to use the new certificate.

Import-Module WebAdministration
$sslBindings = Get-childItem IIS:\SslBindings
$sslBindings | foreach-Object{$path = "IIS:\SslBindings\" + $_.IPAddress.IPAddressToString + "!" + $_.port; Remove-Item $path; $cert | new-item $path}
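To confirm the rebinding took effect, a quick check such as the one below should show every binding pointing at the wildcard certificate's thumbprint (compare against $cert.Thumbprint from step 7).

# List the SSL bindings known to IIS and the certificates registered with HTTP.SYS.
Get-ChildItem IIS:\SslBindings
cmd.exe /c "netsh http show sslcert"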

 

Install ADFS in Domain "BLUE.CLOUD"

1. On the domain controller "BLUE-DC.BLUE.CLOUD", copy the above "Bluecloud-Cert.pfx" and "Bluecloud-CACert.cer" to the folder "C:\". Then run the following cmdlets to install and configure ADFS.

Install-WindowsFeature -Name ADFS-Federation -IncludeAllSubFeature -IncludeManagementTools

# Set these values:
$domainName = 'blue.cloud'
$adfsPrefix = 'fs'
$username = 'administrator'
$password = 'User@123'
$dnsName = ($adfsPrefix + "." + $domainName)
$PfxPass = ConvertTo-SecureString "123" -AsPlainText -Force
$securePassword = ConvertTo-SecureString -String $password -Force -AsPlainText
$adfsServiceCredential = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList ($domainName + '\' + $username), $securePassword

Import-Certificate -FilePath C:\Bluecloud-CACert.cer -CertStoreLocation cert:\localmachine\root
$cert = Import-pfxCertificate -FilePath C:\Bluecloud-Cert.pfx -CertStoreLocation cert:\localmachine\my -Exportable -Password $PfxPass

# Configure AD FS
Install-AdfsFarm `
    -CertificateThumbprint $cert.Thumbprint `
    -FederationServiceName $dnsName `
    -ServiceAccountCredential $adfsServiceCredential `
    -OverwriteConfiguration

Set-ADFSProperties -IgnoreTokenBinding $True
Set-ADFSWebConfig -HRDCookieEnabled $false
cmd.exe /c "setspn -U -S http/fs.blue.cloud administrator"
cmd.exe /c "setspn -U -S http/blue-dc.blue.cloud administrator"

 

Configure the Name Resolution between Azure Stack and WAP

  1. On the ASDK host machine, open the DNS management console and connect to AzS-DC01.AZURESTACK.LOCAL.
  2. Add a conditional forwarder that forwards queries for the domain "BLUE.CLOUD" to the IP address of the DNS server "BLUE-DC.BLUE.CLOUD".
  3. On BLUE-DC.BLUE.CLOUD, open the DNS management console and add a new DNS zone called "LOCAL.AZURESTACK.EXTERNAL".
  4. Add a host (A) record "ADFS" under that newly created DNS zone, so that "ADFS.LOCAL.AZURESTACK.EXTERNAL" points to the external IP of AzS-BGPNAT01. (A PowerShell sketch of these DNS changes follows this list.)
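If you prefer PowerShell over the DNS console, the following is a rough equivalent of the four steps above. The IP addresses are placeholders and must be replaced with the values from your environment.

# On the ASDK host, targeting AzS-DC01: forward BLUE.CLOUD queries to BLUE-DC's DNS server.
# 10.0.0.10 is a placeholder for the IP address of BLUE-DC.BLUE.CLOUD.
Add-DnsServerConditionalForwarderZone -ComputerName AzS-DC01.azurestack.local -Name "blue.cloud" -MasterServers 10.0.0.10

# On BLUE-DC.BLUE.CLOUD: create the zone and the ADFS host record.
# 192.168.1.100 is a placeholder for the external IP of AzS-BGPNAT01.
Add-DnsServerPrimaryZone -Name "local.azurestack.external" -ReplicationScope Domain
Add-DnsServerResourceRecordA -ZoneName "local.azurestack.external" -Name "adfs" -IPv4Address 192.168.1.100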

Configure the Trust between WAP Tenant Portal, Azure Stack's ADFS and BLUE.CLOUD's ADFS

1. On ASDK Host machine, create a text file "C:\Rules.txt". Copy and paste the following content to that file.

@RuleTemplate = "LdapClaims"
@RuleName = "LDAP UPN"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
 => issue(store = "Active Directory", types = ("http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn"), query = ";userPrincipalName;{0}", param = c.Value);

@RuleTemplate = "LdapClaims"
@RuleName = "LDAP Groups"
c:[Type == "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", Issuer == "AD AUTHORITY"]
 => issue(store = "Active Directory", types = ("http://schemas.xmlsoap.org/claims/Group"), query = ";tokenGroups(domainQualifiedName);{0}", param = c.Value);

@RuleTemplate = "PassThroughClaims"
@RuleName = "Passthru UPN"
c:[Type == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn"]
 => issue(claim = c);

@RuleTemplate = "PassThroughClaims"
@RuleName = "Passthru Groups"
c:[Type == "http://schemas.xmlsoap.org/claims/Group"]
 => issue(claim = c);

3. Copy Bluecloud-CACert.cer to the ASDK Host machine's C drive.

4. Open GPMC.MSC and edit the Default Domain Policy. Import Bluecloud-CACert.cer into the Trusted Root Certification Authorities store in that GPO.

5. On ASDK Host machine, run the following cmdlets.

#Save Azure Stack's ADFS Metadata
invoke-webrequest https://adfs.local.azurestack.external/FederationMetadata/2007-06/FederationMetadata.xml -OutFile "C:\FederationMetadata-AzureStack.xml"
#Save Azure Stack's CA Certificate
Copy-Item \\su1fileserver\SU1_Infrastructure_1\AzureStackCertStore\Internal\Current\Root\Cert\AzureStackCertificationAuthority.cer C:\

#Publish ADFS's HTTPS port
Enter-PSSession AzS-BGPNAT01.azurestack.local
$externalNic = Get-NetIPConfiguration | ? { [string]$_.IPv4Address -ne "192.168.200.1" }
$externalNicName = $externalNic.InterfaceAlias
$externalIP = $externalNic.IPv4Address.IPAddress + "/" + $externalNic.IPv4Address.PrefixLength
Add-NetNatExternalAddress -NatName BGPNAT -IPAddress $externalNic.IPv4Address.IPAddress -PortStart 443 -PortEnd 443
Add-NetNatStaticMapping -ExternalIPAddress $externalIP -ExternalPort 443 -InternalIPAddress "192.168.102.5" -InternalPort 443 -NatName BGPNAT -Protocol TCP

#Copy Rules.txt to AzS-ADFS01
Copy-Item C:\Rules.txt \\AzS-ADFS01.azurestack.local\C$
#Create trust between BLUE.CLOUD ADFS and Azure Stack ADFS
Enter-PSSession AzS-ADFS01.azurestack.local
Set-ADFSWebConfig -HRDCookieEnabled $false
Set-ADFSProperties -IgnoreTokenBinding $True
$policy = Get-AdfsAccessControlPolicy -Name "Permit everyone"
Add-ADFSRelyingPartyTrust -Name "Blue Cloud" `
    -MetadataUrl https://fs.blue.cloud/federationmetadata/2007-06/federationmetadata.xml `
    -AutoUpdateEnabled:$true  `
    -MonitoringEnabled:$true `
    -IssuanceTransformRulesFile "C:rules.txt" `
    -AccessControlPolicyName $policy.name `
    -ClaimsProviderName @("Active Directory") `
    -EnableJWT $true
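Before moving on, it can be worth confirming from the ASDK host that both the NAT mapping and the relying party trust were created. A quick check along these lines (using the same implicit credentials as the Enter-PSSession calls above) is usually enough:

# Check the published HTTPS mapping on the NAT VM.
Invoke-Command -ComputerName AzS-BGPNAT01.azurestack.local -ScriptBlock { Get-NetNatStaticMapping -NatName BGPNAT }
# Check the new relying party trust on Azure Stack's ADFS VM.
Invoke-Command -ComputerName AzS-ADFS01.azurestack.local -ScriptBlock { Get-AdfsRelyingPartyTrust -Name "Blue Cloud" }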

6. On BLUE-DC.BLUE.CLOUD, copy "AzureStackCertificationAuthority.cer" and "Rules.txt" to the C:\ drive.

7. Run the cmdlets below to configure the ADFS.

$policy = Get-AdfsAccessControlPolicy -Name "Permit everyone"
Add-AdfsRelyingPartyTrust -Name "WAP Tenant Portal" `
    -MetadataUrl "https://blue-wap.blue.cloud:30081/FederationMetadata/2007-06/FederationMetadata.xml" `
    -EnableJWT:$true `
    -AutoUpdateEnabled:$true `
    -IssuanceTransformRulesFile "C:rules.txt" `
    -ClaimsProviderName @() `
    -AccessControlPolicyName $policy.name
Add-AdfsClaimsProviderTrust -Name "Azure Stack" `
    -MetadataFile "C:FederationMetadata-AzureStack.xml" `
    -AutoUpdateEnabled:$false `
    -AcceptanceTransformRulesFile "C:rules.txt"
Set-AdfsRelyingPartyTrust -TargetName "WAP Tenant Portal" -ClaimsProviderName @()

8. On BLUE-WAP.BLUE.CLOUD, copy "AzureStackCertificationAuthority.cer" to the C:\ drive. Then run the following cmdlets.

Import-Certificate -FilePath C:\AzureStackCertificationAuthority.cer -CertStoreLocation cert:\localmachine\root

$fqdn = 'fs.blue.cloud'
$dbServer = 'blue-wap.blue.cloud'
$dbpassword= 'User@123'
$portalConfigStoreConnectionString = [string]::Format('Data Source={0};Initial Catalog=Microsoft.MgmtSvc.PortalConfigStore;User ID=sa;Password={1}', $dbServer, $dbPassword)

# Configure Tenant to use ADFS
Set-MgmtSvcRelyingPartySettings -Target Tenant `
    -MetadataEndpoint https://$fqdn/FederationMetadata/2007-06/FederationMetadata.xml `
    -DisableCertificateValidation `
    -ConnectionString $portalConfigStoreConnectionString

 

Verify that the user AzureStackAdmin@AzureStack.local can log in to both Azure Stack and WAP

On the ASDK host machine, use the account "AzureStackAdmin@AzureStack.local" to log in to the Azure Stack tenant portal "https://portal.local.azurestack.external" and the WAP tenant portal "https://blue-wap.blue.cloud:30081".

 

In the next blog, we will deploy WAP connector.

 

Query to find content with “enable for on-demand distribution” option set


If you need to determine which of your ConfigMgr packages, applications or update deployment packages has the "enable for on-demand distribution" option enabled, this query will help.  Run it in SQL Server Management Studio against the ConfigMgr site database.  You can adjust the where clause to also look for other configuration settings, such as packages with the persist-in-cache option set (see the example after the query).  Enjoy.

 

SELECT pkg.*,
   (PkgFlags&0x01000000)/0x01000000 AS PKG_DO_NOT_DOWNLOAD,
   (PkgFlags&0x02000000)/0x02000000 AS PKG_PERSIST_IN_CACHE,
   (PkgFlags&0x04000000)/0x04000000 AS PKG_USE_BINARY_DELTA_REP,
   (PkgFlags&0x10000000)/0x10000000 AS PKG_NO_PACKAGE,
   (PkgFlags&0x20000000)/0x20000000 AS PKG_USE_SPECIAL_MIF,
   (PkgFlags&0x40000000)/0x40000000 AS PKG_DISTRIBUTE_ON_DEMAND
FROM dbo.v_Package pkg
WHERE ((PkgFlags&0x40000000)/0x40000000) = 1
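As an example of adjusting the filter, the sketch below runs the same kind of query for packages that have the persist-in-cache flag set, this time from PowerShell. It assumes the SqlServer module's Invoke-Sqlcmd is available and uses placeholder server and database names ("CM01", "CM_P01"); substitute your own site server and site database.

# Find packages with the "persist content in client cache" flag set (sketch).
# "CM01" and "CM_P01" are placeholders for your site server and site database.
$query = @"
SELECT pkg.PackageID, pkg.Name,
       (PkgFlags&0x02000000)/0x02000000 AS PKG_PERSIST_IN_CACHE
FROM dbo.v_Package pkg
WHERE ((PkgFlags&0x02000000)/0x02000000) = 1
"@
Invoke-Sqlcmd -ServerInstance "CM01" -Database "CM_P01" -Query $query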

Dynamics 365 LinkedIn Integration


LinkedIn and Dynamics 365 (CRM) are world-leading sales tools, and now they are coming together to help you save time toggling back and forth between your Dynamics 365 (CRM) and LinkedIn Sales Navigator, and to focus on what matters most: selling.

The LinkedIn Sales Navigator for Microsoft Dynamics 365 (CRM) is designed to create a seamless experience between Sales Navigator and Microsoft Dynamics 365 (CRM), saving your reps valuable time.

  • View LinkedIn information where you’re already tracking other sales activity
  • Use icebreakers to identify commonalities between you and your prospects
  • Uncover the best way to get introduced through TeamLink
  • Find new contacts directly with Recommended Leads
  • Get sales updates including news mentions and job changes when viewing CRM records

 

Below you'll find a brief guide to implementing the solution and some of the benefits this will bring to your sales force.

Import the Solution

Download the LinkedIn Solution for Dynamics to a folder on your computer. Do not unzip the file.

Log into your Dynamics 365 (CRM) environment

Navigate to Settings > Solutions to open the Solutions grid

Click Import (1) to open the Select Solution Package dialog

In the Select Solution Package dialog, click Choose File to browse for the solution file you downloaded and saved > click Next to see the Solution Information > click Next to see the Import Options > click Import, and wait for the import process to complete.

The imported solution is listed at the top of the list of solutions. No further configuration is needed at this point and you can close out of Solutions.

Sign in to LinkedIn Sales Navigator

Next up is to sign into LinkedIn Sales Navigator from Dynamics 365.

Note:

All Dynamics 365 (CRM) integrations require Sales Navigator Team edition or above. If you don't have Sales Navigator, you can contact us to schedule a Sales Navigator demo and get a free trial for your team.

To sign in to LinkedIn Sales Navigator and link your Dynamics 365 instance with your LinkedIn account, open a record, e.g. an Account.


Click the Form Sections chooser (1) and then LinkedIn Company Profile (2)

Click Sign in to open the sign in dialog

Provide your credentials to LinkedIn

Using LinkedIn Sales Navigator for Dynamics 365 (CRM)

Once logged in, you will see content for the selected record in the Sales Navigator widget. In the widget for LinkedIn Company Profile you'll see a basic information section to the left, and three headers: Recommended Leads (1), Connections, and News.

Use the Recommended Leads section to find new leads directly in Dynamics 365 (CRM)


Uncover the best way to get introduced through Connections, and get Account & Lead Updates including news mentions and job changes in the News section

You can find LinkedIn’s Sales Navigator widget everywhere you need to validate your customer or prospect data: Account, Opportunity, Contact, and Lead records in your Dynamics 365 (CRM)

If you open a contact record you'll find new options for Activities (2), including the new Pointdrive Presentation option (allowing you to share and track content with your leads)

Note that for the contact records you also have a LinkedIn Member Profile widget

The LinkedIn Member Profile widget shows a basic information section to the left, and three headers: Icebreakers (1), Get Introduced, and Related Leads.

Use the Icebreakers section to identify commonalities between you and your prospects, the prospect's recent activity, and more.

The Get Introduced section helps you uncover the best way to get introduced, either through your own network or via your co-workers' network (TeamLink).


In the Related Leads section you can find leads related to your prospect, helping you map out the entire buying committee.

I hope you will enjoy this unique way of saving time toggling back and forth between your Dynamics 365 (CRM) and LinkedIn Sales Navigator to focus on what matters most—selling!

See also

  • Presentation from Microsoft Inspire 2017: "Microsoft Relationship Sales - combining LinkedIn Sales Navigator and Dynamics 365 for Sales" -  link

Free summer reading on Windows, Office, Dynamics CRM, or SQL Server!


If you haven't put together your summer vacation reading list yet, pay attention! This year, as every year, Microsoft is releasing dozens of free e-books. Everyone can download as many books as they want, or at least as many as your Kindle's memory can hold.

The original blog post includes instructions for downloading all of the books at once. If you are interested in only some of them, you can download them in a suitable format from the list below.

Category Title Format
Azure Microsoft Azure Essentials Azure Automation PDF
MOBI
EPUB
Azure Microsoft Azure Essentials Azure Machine Learning PDF
MOBI
EPUB
Azure Microsoft Azure Essentials Fundamentals of Azure PDF
MOBI
EPUB
Azure Microsoft Azure Essentials Fundamentals of Azure, Second Edition PDF
Azure Microsoft Azure Essentials Fundamentals of Azure, Second Edition Mobile PDF
Azure Microsoft Azure Essentials Migrating SQL Server Databases to Azure – Mobile PDF
Azure Microsoft Azure Essentials Migrating SQL Server Databases to Azure 8.5X11 PDF
Azure Microsoft Azure ExpressRoute Guide PDF
Azure Overview of Azure Active Directory DOC
Azure Rapid Deployment Guide For Azure Rights Management PDF
Azure Rethinking Enterprise Storage: A Hybrid Cloud Model PDF
MOBI
EPUB
BizTalk BizTalk Server 2016 Licensing Datasheet PDF
BizTalk BizTalk Server 2016 Management Pack Guide DOC
Cloud Enterprise Cloud Strategy PDF
MOBI
EPUB
Cloud Enterprise Cloud Strategy – Mobile PDF
Developer .NET Microservices: Architecture for Containerized .NET Applications PDF
Developer .NET Technology Guidance for Business Applications PDF
Developer Building Cloud Apps with Microsoft Azure™: Best practices for DevOps, data storage, high availability, and more PDF
MOBI
EPUB
Developer Containerized Docker Application Lifecycle with Microsoft Platform and Tools PDF
Developer Creating Mobile Apps with Xamarin.Forms, Preview Edition 2 PDF
MOBI
EPUB
Developer Creating Mobile Apps with Xamarin.Forms: Cross-platform C# programming for iOS, Android, and Windows PDF
MOBI
EPUB
Developer Managing Agile Open-Source Software Projects with Microsoft Visual Studio Online PDF
MOBI
EPUB
Developer Microsoft Azure Essentials Azure Web Apps for Developers PDF
MOBI
EPUB
Developer Microsoft Platform and Tools for Mobile App Development PDF
Developer Microsoft Platform and Tools for Mobile App Development – Mobile PDF
Developer Programming Windows Store Apps with HTML, CSS, and JavaScript, Second Edition PDF
MOBI
EPUB
Developer Team Foundation Server to Visual Studio Team Services Migration Guide PDF
Dynamics 5 cool things you can do with CRM for tablets PDF
Dynamics Create Custom Analytics in Dynamics 365 with Power BI PDF
Dynamics Create or Customize System Dashboards PDF
Dynamics Create Your First CRM Marketing Campaign PDF
Dynamics CRM Basics for Outlook basics PDF
Dynamics CRM Basics for Sales Pros and Service Reps PDF
Dynamics Give Great Customer Service with CRM PDF
Dynamics Go Mobile with CRM for Phones – Express PDF
Dynamics Go Mobile with CRM for Tablets PDF
Dynamics Import Contacts into CRM PDF
Dynamics Introducing Microsoft Social Engagement PDF
Dynamics Introduction to Business Processes PDF
Dynamics Meet Your Service Goals with SLAs and Entitlements PDF
Dynamics Microsoft Dynamics CRM 2016 Interactive Service Hub User Guide PDF
Dynamics Microsoft Dynamics CRM 2016 On-Premises Volume Licensing and Pricing Guide PDF
Dynamics Microsoft Dynamics CRM for Outlook Installing Guide for use with Microsoft Dynamics CRM Online PDF
Dynamics Microsoft Dynamics CRM Resource Guide 2015 PDF
Dynamics Microsoft Social Engagement for CRM PDF
Dynamics Product Overview and Capability Guide Microsoft Dynamics NAV 2016 PDF
Dynamics RAP as a Service for Dynamics CRM PDF
Dynamics Set Up A Social Engagement Search For Your Product PDF
Dynamics Social is for Closers PDF
Dynamics Start Working in CRM PDF
Dynamics Your Brand Sux PDF
General 10 essential tips and tools for mobile working PDF
General An employee’s guide to healthy computing PDF
General Guide for People who have Language or Communication Disabilities DOC
General Guide for People who have Learning Disabilities DOC
Licensing Introduction to Per Core Licensing and Basic Definitions PDF
Licensing Licensing Windows and Microsoft Office for use on the Macintosh PDF
Licensing VLSC Software Assurance Guide PDF
Licensing Windows Server 2016 and System Center 2016 Pricing and Licensing FAQs PDF
Office Azure AD/Office 365 seamless sign-in PDF
Office Content Encryption in Microsoft Office 365 PDF
Office Controlling Access to Office 365 and Protecting Content on Devices PDF
Office Data Resiliency in Microsoft Office 365 PDF
Office Excel 2016 keyboard shortcuts and function keys DOC
Office Excel Online Keyboard Shortcuts PDF
Office File Protection Solutions in Office 365 PDF
Office Get Started With Microsoft OneDrive PDF
Office Get Started With Microsoft Project Online PDF
Office Getting started with MyAnalytics DOC
Office How To Recover That Un-Saved Office Document PDF
Office InfoPath 2013 Keyboard Shortcuts PDF
Office Keyboard shortcuts for Microsoft Word 2016 for Windows DOC
Office Licensing Microsoft Office 365 ProPlus Subscription Service in Volume Licensing PDF
Office Licensing Microsoft Office software in Volume Licensing PDF
Office Microsoft Classroom Deployment PDF
Office Microsoft Excel 2016 for Mac Quick Start Guide PDF
Office Microsoft Excel 2016 Quick Start Guide PDF
Office Microsoft Excel Mobile Quick Start Guide PDF
Office Microsoft Excel VLOOKUP Troubleshooting Tips PDF
Office Microsoft OneNote 2016 for Mac Quick Start Guide PDF
Office Microsoft OneNote 2016 Quick Start Guide PDF
Office Microsoft OneNote 2016 Tips and Tricks PDF
Office Microsoft OneNote Mobile Quick Start Guide PDF
Office Microsoft Outlook 2016 for Mac Quick Start Guide PDF
Office Microsoft Outlook 2016 Quick Start Guide PDF
Office Microsoft Outlook 2016 Tips and Tricks PDF
Office Microsoft PowerPoint 2016 Quick Start Guide PDF
Office Microsoft PowerPoint 2016 for Mac Quick Start Guide PDF
Office Microsoft PowerPoint Mobile Quick Start Guide PDF
Office Microsoft Word 2016 for Mac Quick Start Guide PDF
Office Microsoft Word 2016 Quick Start Guide PDF
Office Microsoft Word Mobile Quick Start Guide PDF
Office Microsoft® Office 365: Connect and Collaborate Virtually Anywhere, Anytime PDF
Office Monitoring and protecting sensitive data in Office 365 DOC
Office Office 365 Dedicated Platform vNext Service Release PDF
Office Office 365 Licensing Brief PDF
Office OneNote Online Keyboard Shortcuts PDF
Office Outlook Web App Keyboard Shortcuts PDF
Office Own Your Future: Update Your Skills with Resources and Career Ideas from Microsoft® XPS
PDF
MOBI
EPUB
Office PowerPoint Online Keyboard Shortcuts PDF
Office Security Incident Management in Microsoft Office 365 PDF
PDF
Office SharePoint Online Dedicated & OneDrive for Business Dedicated vNext Service Release PDF
Office Skype for Business User Tips & Tricks for Anyone PDF
Office Switching from Google Apps to Office 365 for business PDF
Office Tenant Isolation in Microsoft Office 365 PDF
Office Windows 10 Tips and Tricks PDF
Office Word Online Keyboard Shortcuts PDF
Office Working with SmartArt Graphics Keyboard Shortcuts PDF
Power BI Ask, find, and act—harnessing the power of Cortana and Power BI DOC
Power BI Bidirectional cross-filtering in SQL Server Analysis Services 2016 and Power BI Desktop DOC
Power BI Configuring Power BI mobile apps with Microsoft Intune DOC
Power BI Getting started with the Power BI for Android app DOC
Power BI Getting Started with the Power BI for iOS app DOC
Power BI How to plan capacity for embedded analytics with Power BI Premium PDF
Power BI Introducing Microsoft Power BI PDF
Power BI Introducing Microsoft Power BI – Mobile PDF
Power BI Microsoft Power BI Premium Whitepaper PDF
Power BI Power BI mobile apps—enabling data analytics on the go DOC
Power BI Propelling digital transformation in manufacturing operations with Power BI DOC
Power BI Using Power BI to visualize data insights from Microsoft Dynamics CRM Online DOC
PowerShell Microsoft Dynamics GP 2015 R2 PowerShell Users Guide PDF
PowerShell PowerShell Integrated Scripting Environment 3.0 PDF
PowerShell Simplify Group Policy administration with Windows PowerShell PDF
PowerShell Windows PowerShell 3.0 Examples PDF
PowerShell Windows PowerShell 3.0 Language Quick Reference PDF
PowerShell WINDOWS POWERSHELL 4.0 LANGUAGE QUICK REFERENCE PDF
PowerShell Windows PowerShell 4.0 Language Reference Examples PDF
PowerShell Windows PowerShell Command Builder User’s Guide PDF
PowerShell Windows PowerShell Desired State Configuration Quick Reference PDF
PowerShell WINDOWS POWERSHELL INTEGRATED SCRIPTING ENVIRONMENT 4.0 PDF
PowerShell Windows PowerShell Web Access PDF
PowerShell WMI in PowerShell 3.0 PDF
PowerShell WMI in Windows PowerShell 4.0 PDF
SharePoint Configuring Microsoft SharePoint Hybrid Capabilities PDF
SharePoint Configuring Microsoft SharePoint Hybrid Capabilities – Mobile PDF
SharePoint Microsoft SharePoint Server 2016 Architectural Models PDF
SharePoint Planning and Preparing for Microsoft SharePoint Hybrid – 8.5 X 11 PDF
SharePoint Planning and Preparing for Microsoft SharePoint Hybrid – Mobile PDF
SharePoint RAP as a Service for SharePoint Server PDF
SharePoint SharePoint Online Dedicated Service Description PDF
SharePoint SharePoint Products Keyboard Shortcuts PDF
SharePoint SharePoint Server 2016 Databases – Quick Reference Guide PDF
SharePoint SharePoint Server 2016 Quick Start Guide PDF
SQL Server Backup and Restore of SQL Server Databases PDF
SQL Server Data Science with Microsoft SQL Server 2016 PDF
SQL Server Deeper insights across data with SQL Server 2016 – Technical White Paper PDF
SQL Server Deploying SQL Server 2016 PowerPivot and Power View in a Multi-Tier SharePoint 2016 Farm DOC
SQL Server Deploying SQL Server 2016 PowerPivot and Power View in SharePoint 2016 DOC
SQL Server Introducing Microsoft Azure™ HDInsight™ PDF
MOBI
EPUB
SQL Server Introducing Microsoft Data Warehouse Fast Track for SQL Server 2016 PDF
SQL Server Introducing Microsoft SQL Server 2016: Mission-Critical Applications, Deeper Insights, Hyperscale Cloud, Preview 2 PDF
MOBI
EPUB
SQL Server Introducing Microsoft SQL Server 2016: Mission-Critical Applications, Deeper Insights, Hyperscale Cloud, Preview 2 – Mobile PDF
SQL Server Introducing Microsoft Technologies for Data Storage, Movement and Transformation DOC
SQL Server Microsoft SharePoint Server 2016 Reviewer’s Guide PDF
SQL Server Microsoft SQL Server 2016 Licensing Datasheet PDF
SQL Server Microsoft SQL Server 2016 Licensing Guide PDF
SQL Server Microsoft SQL Server 2016 Mission-Critical Performance Technical White Paper PDF
SQL Server Microsoft SQL Server 2016 New Innovations PDF
SQL Server Microsoft SQL Server 2016 SP1 Editions PDF
SQL Server Microsoft SQL Server In-Memory OLTP and Columnstore Feature Comparison PDF
SQL Server RAP as a Service for SQL Server PDF
SQL Server SQLCAT’s Guide to: Relational Engine PDF
SQL Server Xquery Language Reference PDF
Surface Surface Book User Guide PDF
Surface Surface Pro 4 User Guide PDF
System Center Guide to Microsoft System Center Management Pack for SQL Server 2016 Reporting Services (Native Mode) DOC
System Center Guide to System Center Management Pack for Windows Print Server 2016 DOC
System Center Introducing Microsoft System Center 2012 R2 PDF
MOBI
EPUB
System Center Microsoft System Center Building a Virtualized Network Solution, Second Edition PDF
MOBI
EPUB
System Center Microsoft System Center Data Protection for the Hybrid Cloud PDF
MOBI
EPUB
System Center Microsoft System Center Deploying Hyper-V with Software-Defined Storage & Networking PDF
MOBI
EPUB
System Center Microsoft System Center Extending Operations Manager Reporting PDF
MOBI
EPUB
System Center Microsoft System Center Introduction to Microsoft Automation Solutions PDF
MOBI
EPUB
System Center Microsoft System Center Operations Manager Field Experience PDF
MOBI
EPUB
System Center Microsoft System Center Software Update Management Field Experience PDF
MOBI
EPUB
System Center Microsoft System Center: Building a Virtualized Network Solution PDF
MOBI
EPUB
System Center Microsoft System Center: Cloud Management with App Controller PDF
MOBI
EPUB
System Center Microsoft System Center: Configuration Manager Field Experience PDF
MOBI
EPUB
System Center Microsoft System Center: Designing Orchestrator Runbooks PDF
MOBI
EPUB
System Center Microsoft System Center: Integrated Cloud Platform PDF
MOBI
EPUB
System Center Microsoft System Center: Network Virtualization and Cloud Computing PDF
MOBI
EPUB
System Center Microsoft System Center: Optimizing Service Manager PDF
MOBI
EPUB
System Center Microsoft System Center: Troubleshooting Configuration Manager PDF
MOBI
EPUB
System Center What’s new in System Center 2016 White Paper PDF
Virtualization Understanding Microsoft Virtualization R2 Solutions XPS
PDF
Windows Client Deploying Windows 10: Automating deployment by using System Center Configuration Manager PDF
MOBI
EPUB
Windows Client Deploying Windows 10: Automating deployment by using System Center Configuration Manager – Mobile PDF
Windows Client Getting the most out of Microsoft Edge DOC
Windows Client Introducing Windows 10 for IT Professionals PDF
MOBI
EPUB
Windows Client Introducing Windows 10 for IT Professionals, Preview Edition PDF
MOBI
EPUB
Windows Client Licensing Windows desktop operating system for use with virtual machines PDF
Windows Client Protecting your data with Windows 10 BitLocker DOC
Windows Client RAP as a Service for Windows Desktop PDF
Windows Client Shortcut Keys for Windows 10 DOC
Windows Client Use Reset to restore your Windows 10 PC DOC
Windows Client Volume Licensing Reference Guide Windows 10 Desktop Operating System PDF
Windows Client Windows 10 IT Pro Essentials Support Secrets PDF
PDF
MOBI
EPUB
Windows Client Windows 10 IT Pro Essentials Top 10 Tools PDF
MOBI
EPUB
Windows Client Windows 10 IT Pro Essentials Top 10 Tools – Mobile PDF
Windows Server Automating Windows Server 2016 configuration with PowerShell and DSC DOC
Windows Server Introducing Windows Server 2016 PDF
Windows Server Introducing Windows Server 2016 – Mobile PDF
Windows Server Introducing Windows Server 2016 Technical Preview PDF
Windows Server Introducing Windows Server 2016 Technical Preview – Mobile PDF
Windows Server Offline Assessment for Active Directory PDF
Windows Server RAP as a Service for Active Directory PDF
Windows Server RAP as a Service for Failover Cluster PDF
Windows Server RAP as a Service for Internet Information Services PDF
Windows Server RAP as a Service for Windows Server Hyper-V PDF
Windows Server Windows Server 2016 Licensing PDF

 

 

 
