
Starting with Log Analytics: Part 2 - Importing your own data



This is the second article in my blog series about Log Analytics. We will see how to use Intune and PowerShell to import data from devices into Log Analytics in order to create your own reports.


Other articles

- Part 1: Creating our first Log Analytics workspace 

- Part 2: Importing your own data into the workspace (you are here)

- Part 3: Creating your own lab from a CSV

- Part 4: Creating our first workbook (Soon)

- Part 5: Adding Intune data into Log Analytics (Soon)

- Part 6: Querying Log Analytics data with PowerShell and Graph (Soon)


Our example

In the following example, we want to collect the following information:

- Device name

- User 

- Device model

- BIOS version

- Disk size

- Free disk space

- Free disk space percent

- BitLocker status
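
For reference, here is a minimal sketch of how this information could be collected with PowerShell. The property names used in $Report below are illustrative and are not necessarily the ones used by the detection script shared later in this post.

# Minimal sketch: gathering the report information with CIM and the BitLocker module.
# Get-BitLockerVolume needs the BitLocker module and an elevated session.
$ComputerSystem = Get-CimInstance -ClassName Win32_ComputerSystem
$BIOS           = Get-CimInstance -ClassName Win32_BIOS
$OSDisk         = Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DeviceID='C:'"
$BitLocker      = Get-BitLockerVolume -MountPoint "C:"

$Report = [PSCustomObject]@{
    DeviceName       = $env:COMPUTERNAME
    UserName         = $ComputerSystem.UserName
    Model            = $ComputerSystem.Model
    BIOSVersion      = $BIOS.SMBIOSBIOSVersion
    DiskSizeGB       = [math]::Round($OSDisk.Size / 1GB, 2)
    FreeSpaceGB      = [math]::Round($OSDisk.FreeSpace / 1GB, 2)
    FreeSpacePercent = [math]::Round(($OSDisk.FreeSpace / $OSDisk.Size) * 100, 2)
    BitLockerStatus  = $BitLocker.ProtectionStatus
}

$Report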


Log Analytics workspace 

In the previous post, we created our workspace in Log Analytics:




Required information

For our needs, the following information is required:

- Workspace ID

- Primary key

- Name of the log to create or update


You can find both the Workspace ID and the Primary key in the Agents management part:


Now, we need to create a Custom Log.


Custom logs, tables and fields?

To put it simply, a custom log is a log that contains specific, custom information.


We will create the log and import data into it using PowerShell and Proactive Remediation.

We can, for instance, imagine custom logs for different reports, such as:

- BIOS version report

- Local admin report


In our case, the custom log will be called TestReport.

Once the log has been created (through Intune), you will find a new table in the Custom tables part.

1. Go to Workbooks

2. Go to Custom logs

3. Go to Custom tables

4. You will find TestReport_CL



The TestReport_CL table will contain all information provided by the Proactive Remediation script.

Each piece of information is stored in a field.

See below:

1. Go to Custom fields

2. Go to the Table selection and check your table


3. All fields will be listed


4. Each field corresponds to a specific column
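
As an illustration of how these fields map to columns, the table can be queried from PowerShell once a few imports have run. The sketch below assumes the Az.OperationalInsights module and a prior Connect-AzAccount; the column names (with their type suffixes such as _s for strings and _d for numbers) are examples, since the exact names depend on the properties sent by the script.

# Illustrative query of the custom table (assumes Az.OperationalInsights and Connect-AzAccount).
# Column names below are examples only.
$WorkspaceId = "<Workspace ID>"

$Query = @"
TestReport_CL
| project TimeGenerated, DeviceName_s, BIOSVersion_s, FreeSpacePercent_d, BitLockerStatus_s
| order by TimeGenerated desc
"@

$Result = Invoke-AzOperationalInsightsQuery -WorkspaceId $WorkspaceId -Query $Query
$Result.Results | Format-Table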


The Proactive Remediation script

Information to provide

In the script, you will need to provide the following information as variables:

- Workspace ID

- Primary key

- Name of the log to create or update
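
For example, the top of the script could look like the lines below. The variable names are indicative ($customerId is the one quoted in the comments at the end of this post); adapt them to the downloaded script.

# Values to fill in at the top of the script (variable names are indicative).
$customerId = "<Workspace ID>"     # found in the Agents management part
$sharedKey  = "<Primary key>"      # found in the Agents management part
$logType    = "TestReport"         # custom log name; Log Analytics appends the _CL suffix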


The Proactive Remediation script works as follows:

1. Check whether the BIOS is up to date (you can find this check in the script)

2. Create an array with the information

3. Convert the array to JSON

4. Create a new custom log in Log Analytics (if it does not already exist)

5. Import the array information into the custom log

You can find more information about importing data to Azure Monitor here.
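
For reference, the import itself goes through the Azure Monitor HTTP Data Collector API. The condensed sketch below is adapted from Microsoft's public sample and only covers steps 2 to 5; the downloadable detection script may differ in its details, so treat this as an outline rather than the actual script.

# Condensed sketch of the Azure Monitor HTTP Data Collector API call,
# adapted from Microsoft's public sample. The real detection script may differ.
$customerId = "<Workspace ID>"
$sharedKey  = "<Primary key>"
$logType    = "TestReport"

# Steps 2-3: build the report object and convert it to JSON
$body = @([PSCustomObject]@{
    DeviceName  = $env:COMPUTERNAME
    BIOSVersion = (Get-CimInstance -ClassName Win32_BIOS).SMBIOSBIOSVersion
}) | ConvertTo-Json
$bodyBytes = [Text.Encoding]::UTF8.GetBytes($body)

# Build the HMAC-SHA256 signature expected by the API
$rfc1123date  = [DateTime]::UtcNow.ToString("r")
$stringToHash = "POST`n$($bodyBytes.Length)`napplication/json`nx-ms-date:$rfc1123date`n/api/logs"
$hmac         = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key     = [Convert]::FromBase64String($sharedKey)
$hash         = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToHash))
$signature    = "SharedKey " + $customerId + ":" + [Convert]::ToBase64String($hash)

# Steps 4-5: POST the JSON; the TestReport_CL table is created on the first import
$uri     = "https://" + $customerId + ".ods.opinsights.azure.com/api/logs?api-version=2016-04-01"
$headers = @{
    "Authorization" = $signature
    "Log-Type"      = $logType
    "x-ms-date"     = $rfc1123date
}
Invoke-RestMethod -Uri $uri -Method Post -ContentType "application/json" -Headers $headers -Body $bodyBytes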


Getting script

Click on the picture below to get the detection script that allows us to import data into Log Analytics.


Creating the remediation package

1. Go to the Microsoft Endpoint Manager admin center

2. Go to Reports

3. Go to Endpoint analytics

4. Go to Proactive remediations

5. Click on Create script package

6. Choose a name

7. Click on Next

8. Click on Detection script file

9. Choose the Detection_Script.ps1

10. Click on Next

11. Select a group

12. In the Schedule part, choose when the package should run.

13. In our case, we will run it every day

14. Click on Apply

15. Click on Next

16. Click on Create




2 comments

Unknown said…

Hello, I am struggling with Part 2 of your lessons. You have a parameter called:

$uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"

I'm not sure what to do here. I have my subscription ID and my Tenant ID but I don't know what my $CustomerID is. How do I find this?

Thank you.

Anonymous said…

Can we add something to the script in order to overwrite instead of appending to the logs? It's creating redundancy this way.
