Posts

Showing posts from June, 2025

How schools are responding

As cyber threats and incidents continue to rise, schools must respond swiftly with cybersecurity measures to mitigate the potential effects of an attack. Cybersecurity involves protecting the data, systems, and operations that make learning with technology possible in K-12 schools. Robust cybersecurity measures include:
Steps for detecting, preventing, and responding to threats.
Institutional participation and awareness, from the youngest student all the way to educators, administrators, and IT professionals.
Security technologies that detect, protect, and respond to incoming threats.
User authentication: Strong user authentication methods, like multifactor authentication (MFA) and two-factor authentication (2FA), help ensure that only authorized users access school systems.
Data encryption: Software that encrypts sensitive data helps safeguard student records and other confidential information.
Content filters: Use of content filters restricts access to inapprop...

Current landscape of K-12 cybersecurity

In this unit, you learn about the current K-12 cybersecurity landscape and are introduced to how you, as an educator, can prepare to implement cybersecurity measures in your classroom. Cyberattacks on K-12 schools have become a serious concern in recent years, as the education sector increasingly relies on digital technology to support teaching and learning activities. With the widespread adoption of online learning platforms, student information systems, and other digital tools, schools are a rich target for cybercriminals looking to exploit vulnerabilities in these systems. Moreover, schools are often seen as easier targets than other organizations, as they might have limited resources and technical expertise to secure their networks adequately. Cumulative reported incidents over the past five years show a significant increase in school-focused attacks. Since 2016, the K12 Security Information eXchange has cataloged more than 1,619 incidents, a rate of ...

Analyze and visualize data in a lakehouse

After data is ingested, transformed, and loaded, it's ready for others to use. Fabric items provide the flexibility every organization needs, so you can use the tools that work for you. Data scientists can use notebooks or Data Wrangler to explore data and train machine learning models for AI. Report developers can use the semantic model to create Power BI reports. Analysts can use the SQL analytics endpoint to query, filter, aggregate, and otherwise explore data in lakehouse tables. By combining the data visualization capabilities of Power BI with the centralized storage and tabular schema of a data lakehouse, you can implement an end-to-end analytics solution on a single platform.
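To make the analyst workflow concrete, here's a minimal notebook sketch that aggregates a lakehouse table with Spark SQL. The table name (sales) and its columns are assumptions for illustration; in a Fabric notebook, the spark session is created for you.

# Hypothetical example: aggregate a lakehouse Delta table from a notebook.
# Assumes a table named "sales" with "region" and "amount" columns exists.
df = spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM sales
    GROUP BY region
    ORDER BY total_amount DESC
""")
df.show()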

Explore and transform data in a lakehouse

Transform and load data
Most data requires transformation before it's loaded into tables. You might ingest raw data directly into a lakehouse and then further transform and load it into tables. Regardless of your ETL design, you can transform and load data using the same tools used to ingest it. Transformed data can then be loaded as a file or a Delta table. Notebooks are favored by data engineers familiar with programming languages such as PySpark, SQL, and Scala. Dataflows Gen2 are excellent for developers familiar with Power BI or Excel, since they use the Power Query interface. Pipelines provide a visual interface to perform and orchestrate ETL processes. Pipelines can be as simple or as complex as you need.
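As a hedged sketch of the notebook path (the file path and column names here are illustrative assumptions, not from the module), this PySpark cell transforms raw CSV files and loads the result as a Delta table:

from pyspark.sql.functions import col, to_date

# Hypothetical transformation: read raw CSV files already in the
# lakehouse, clean and derive columns, then load the result as a table.
raw = spark.read.option("header", "true").csv("Files/raw/orders")

transformed = (
    raw.withColumn("order_date", to_date(col("order_date")))
       .withColumn("total", col("quantity").cast("int") * col("unit_price").cast("double"))
       .dropna(subset=["order_id"])
)

# In a Fabric lakehouse, saveAsTable writes a Delta table by default.
transformed.write.mode("overwrite").saveAsTable("orders")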

Access data using shortcuts

Another way to access and use data in Fabric is to use shortcuts. Shortcuts enable you to integrate data into your lakehouse while keeping it stored in external storage. Shortcuts are useful when you need to source data that's in a different storage account or even a different cloud provider. Within your lakehouse, you can create shortcuts that point to different storage accounts and other Fabric items, such as data warehouses, KQL databases, and other lakehouses. Source data permissions and credentials are all managed by OneLake. When accessing data through a shortcut to another OneLake location, the identity of the calling user is used to authorize access to the data in the target path of the shortcut. The user must have permissions in the target location to read the data. Shortcuts can be created in both lakehouses and KQL databases, and they appear as folders in the lake. This allows Spark, SQL, Real-Time Intelligence, and Analysis Services to all utilize shortcuts when queryi...
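Because a shortcut surfaces as a folder in the lakehouse, code can read it like any local path. A small assumed example (the shortcut name external_sales and the Parquet format are hypothetical):

# Hypothetical: "external_sales" is a shortcut under Files/ pointing at
# data in another storage account; Spark reads it like any other folder.
df = spark.read.parquet("Files/external_sales")
df.printSchema()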

Ingest data into a lakehouse

Ingesting data into your lakehouse is the first step in your ETL process. Use any of the following methods to bring data into your lakehouse.
Upload: Upload local files.
Dataflows Gen2: Import and transform data using Power Query.
Notebooks: Use Apache Spark to ingest, transform, and load data.
Data Factory pipelines: Use the Copy data activity.
This data can then be loaded directly into files or tables. Consider your data loading pattern when ingesting data to determine whether you should load all raw data as files before processing or use staging tables. Spark job definitions can also be used to submit batch or streaming jobs to Spark clusters. By uploading the binary files from the compilation output of different languages (for example, .jar from Java), you can apply different transformation logic to the data hosted on a lakehouse. Besides the binary file, you can further customize the behavior of the job by uploading more libraries and command line arguments.
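As one hedged example of the notebook method, this cell ingests data from an external ADLS Gen2 path (the URL is a placeholder) and lands it unchanged in the lakehouse Files area, following the load-raw-files-before-processing pattern mentioned above:

# Hypothetical ingestion sketch: copy external data into the lakehouse
# Files area as-is; later steps can validate it and load it into tables.
src = "abfss://data@externalaccount.dfs.core.windows.net/landing/customers"
raw = spark.read.parquet(src)
raw.write.mode("append").parquet("Files/raw/customers")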

Work with Microsoft Fabric lakehouses

Now that you understand the core capabilities of a Microsoft Fabric lakehouse, let's explore how to work with one.
Create and explore a lakehouse
When you create a new lakehouse, three different data items are automatically created in your workspace:
The lakehouse contains shortcuts, folders, files, and tables.
The Semantic model (default) provides an easy data source for Power BI report developers.
The SQL analytics endpoint allows read-only access to query data with SQL.
You can work with the data in the lakehouse in two modes:
Lakehouse mode enables you to add and interact with tables, files, and folders in the lakehouse.
SQL analytics endpoint mode enables you to use SQL to query the tables in the lakehouse and manage its relational semantic model.

Load data into a lakehouse

Fabric lakehouses are a central element of your analytics solution. You can follow the ETL (Extract, Transform, Load) process to ingest and transform data before loading it into the lakehouse. You can ingest data in many common formats from various sources, including local files, databases, and APIs. You can also create Fabric shortcuts to data in external sources, such as Azure Data Lake Storage Gen2 or OneLake. Use the Lakehouse explorer to browse files, folders, shortcuts, and tables and view their contents within the Fabric platform. Ingested data can be transformed and then loaded using either Apache Spark with notebooks or Dataflows Gen2. Use Data Factory pipelines to orchestrate your different ETL activities and land the prepared data into your lakehouse. You can use your lakehouse for many purposes, including:
Analyze using SQL.
Train machine learning models.
Perform analytics on real-time data.
Develop reports in Power BI.
Secure a lakehouse
Lakehouse access is managed eith...

Explore the Microsoft Fabric lakehouse

A lakehouse presents as a database and is built on top of a data lake using Delta format tables. Lakehouses combine the SQL-based analytical capabilities of a relational data warehouse and the flexibility and scalability of a data lake. Lakehouses store all data formats and can be used with various analytics tools and programming languages. As cloud-based solutions, lakehouses can scale automatically and provide high availability and disaster recovery. Some benefits of a lakehouse include:
Lakehouses use Spark and SQL engines to process large-scale data and support machine learning or predictive modeling analytics.
Lakehouse data is organized in a schema-on-read format, which means you define the schema as needed rather than having a predefined schema.
Lakehouses support ACID (Atomicity, Consistency, Isolation, Durability) transactions through Delta Lake formatted tables for data consistency and integrity.
Lakehouses are a single location for data engineers, data scie...
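To illustrate the ACID and versioning behavior that Delta tables provide, here's a hedged notebook sketch (the orders table and its columns are assumed to exist); each committed write becomes a new, queryable version:

# Hypothetical: every Delta write is an atomic, versioned transaction.
spark.sql("UPDATE orders SET status = 'shipped' WHERE status = 'packed'")

# Inspect the transaction history of the table.
spark.sql("DESCRIBE HISTORY orders").select("version", "operation").show()

# Time travel: query the table as it was at an earlier version.
previous = spark.sql("SELECT * FROM orders VERSION AS OF 0")
previous.show()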

Copilot plugins and role requirements

Your role controls what activities you have access to, such as configuring settings, assigning permissions, or performing tasks. Copilot doesn't go beyond the access you have. Additionally, individual Microsoft plugins may have their own role requirements for accessing the service and data they represent. As an example, an analyst who has been assigned a security operator role or a Copilot workspace contributor role is able to access the Copilot portal and create sessions, but to use the Microsoft Sentinel plugin, they would need an appropriate role, such as Microsoft Sentinel Reader, to access incidents in the workspace. To access the devices, privileges, and policies available through the Microsoft Intune plugin, that same analyst would need another service-specific role, such as the Intune Endpoint Security Manager role. Generally speaking, Microsoft plugins in Copilot use the OBO (on behalf of) model, meaning that Copilot knows that a customer has licenses to specific products and is aut...

Role permissions

To ensure that users can access the features of Copilot, they need the appropriate role permissions. Role permissions are configured per workspace. Permissions can be assigned using Microsoft Entra ID roles or Security Copilot roles. As a best practice, assign the least privileged role applicable for each user.
The Microsoft Entra ID roles are:
Global administrator
Security administrator
Security operator
Security reader
Although these Microsoft Entra ID roles grant users varying levels of access to Copilot, the scope of these roles extends beyond Copilot. For this reason, Security Copilot introduces two roles that function like access groups but aren't Microsoft Entra ID roles. Instead, they only control access to the capabilities of the Security Copilot platform.
The Microsoft Security Copilot roles are:
Copilot owner
Copilot contributor
The Security Administrator and Global Administrator roles in Microsoft Entra automatically inherit Copilot owner access. Only use...

Set up the default environment

To set up the default environment, you need at least a Security Administrator role. During the setup of Security Copilot, you're prompted to configure settings. These include:
SCU capacity - Select the capacity of SCUs previously provisioned. Each workspace must have its own capacity.
Data storage - When an organization onboards to Copilot, one of the available settings determines where your customer data is stored. Configuration of the data storage location applies at a workspace level. Microsoft Security Copilot operates in the Microsoft Azure data centers in the European Union (EUDB), the United Kingdom, the United States, Australia and New Zealand, Japan, Canada, and South America.
Decide where your prompts are evaluated - You can restrict evaluation to within your geo or allow evaluation anywhere in the world.
Logging audit data in Microsoft Purview - As part of the initial setup and listed under Owner settings in the standalone experience, you can choose to ...

Describe how to enable Microsoft Security Copilot

To start using Microsoft Security Copilot, organizations need to take steps to onboard the service and users. These include:
Provision Copilot capacity
Set up the default environment
Assign role permissions
Provision capacity
Security Copilot operates on a provisioned capacity and an overage model. Provisioned capacity is billed by the hour, while overage capacity is billed on usage. You can flexibly provision Security Compute Units (SCUs) to accommodate regular workloads and adjust them anytime without long-term commitments. An SCU is the unit of measure of computing power used to run Copilot in both the standalone and embedded experiences. To manage unexpected demand spikes, you can allocate an overage amount so that additional SCUs are available when the initially provisioned units are depleted. Overage units are billed on demand and can be set as unlimited or capped at a maximum amount. This approach enables predictable billing while providing the flex...

Other prompting tips

Some things to remember when coming up with your own prompts:
Be as specific, clear, and concise as you can about what you want to achieve. You can start simply with your first prompt, but as you get more familiar with Copilot, include more details following the elements of an effective prompt.
Basic prompt: Pearl Sleet actor
Better prompt: Can you give me information about Pearl Sleet activity, including a list of known indicators of compromise and tools, tactics, and procedures (TTPs)?
Iterate. Subsequent prompts are typically needed to further clarify what you need or to try other versions of a prompt to get closer to what you're looking for. Like all LLM-based systems, Copilot can respond to the same prompt in slightly different ways.
Provide necessary context to narrow down where Copilot looks for data.
Basic prompt: Summarize incident 15134.
Better prompt: Summarize incident 15134 in Microsoft Defender XDR into a paragraph that I can submit to my manager and cre...

Describe the elements of an effective prompt

We defined a prompt as the text-based, natural language input you provide in the prompt bar that instructs Microsoft Security Copilot to generate a response. Copilot provides promptbooks and prompt suggestions, which are helpful, particularly if you're just starting an incident investigation. At some point, however, you'll want and need to enter your own prompts. In those cases, the quality of the response that Copilot returns depends in large part on the quality of the prompt used. In general, a well-crafted prompt with clear and specific inputs leads to more useful responses from Copilot.
Elements of an effective prompt
Effective prompts give Copilot adequate and useful parameters to generate a valuable response. Security analysts or researchers should include the following elements when writing a prompt:
Goal - the specific, security-related information that you need
Context - why you need this information or how you'll use it
Expectations - the format or target audience you want...

Process log

During this process, Copilot generates a process log that is visible to the user. The user can see what capability is used to generate the response. This is important because it enables the user to determine whether the response was generated from a trusted source. In the screenshot that follows, the process log shows that Copilot chose the Incident Analysis capability. The process log also shows that the final output went through safety checks, which is part of Microsoft’s commitment to responsible AI.

Describe how Microsoft Security Copilot processes prompt requests

Now that you have a basic understanding of plugins, capabilities, and how the user interacts with Microsoft Security Copilot through prompts, it’s worth taking a look under the hood to see how these components come together to process a prompt request and help security analysts.
Process flow
When a user submits a prompt, Copilot processes that prompt to generate the best possible response.
Submit a prompt: The process starts when a user submits a prompt in the prompt bar.
Orchestrator: Security Copilot sends the information to the Copilot backend, referred to as the orchestrator. The orchestrator is Copilot’s system for composing capabilities together to answer a user’s prompt. It determines the initial context and builds a plan using all the available capabilities (skills).
Build context: Once a plan is defined and built, Copilot executes that plan to get the required data context to answer the prompt.
Plugins: In the course of executing the plan, Copilot analyzes all data and pa...

Terminology

Agents
A Microsoft Security Copilot agent is an advanced, AI-powered assistant built into Microsoft Security Copilot. These agents go beyond answering questions: they can autonomously manage high-volume security and IT tasks. They’re deeply integrated with Microsoft’s security tools and can also work with partner solutions. Each agent is tailored for specific security scenarios, such as threat protection, identity management, or data security. These agents are designed to learn from feedback, adapt to your organization’s workflows, and operate securely within Microsoft’s Zero Trust framework. See the summary and resources unit for links to more information on Microsoft Security Copilot agents.
Orchestrator
The orchestrator is Copilot’s system for composing capabilities together to answer a user’s prompt. This function is illustrated in more detail in the subsequent unit that describes how Copilot processes prompt requests.

Workspaces

Copilot workspaces are separate Copilot work environments within the tenant in which your Copilot instance is operating. To help you better understand the concept of workspaces, we'll use the analogy of a house with multiple rooms. Each room is configured to be optimized for its function and the people who will use it. When someone enters the house, they may have access to some rooms but not others. You can think of Copilot workspaces as fitting into this analogy. A Copilot workspace is analogous to a room in a house. You can also think of the house as analogous to a tenant. In the same way that a house has multiple rooms, the tenant in which Copilot is operating can have multiple workspaces. Through the tenant-switching capability in Security Copilot, a user can select the tenant in which they'll be working. In our analogy, this is a Copilot user getting access to the house. Once the tenant is selected, a Copilot user can access and work in any workspace (room in the house) t...

Plugins and capabilities

In the previous unit, we mentioned that Copilot integrates with various sources through plugins, including Microsoft's own security products such as Microsoft Sentinel, Microsoft Defender XDR, and Microsoft Intune, non-Microsoft solutions, and open-source intelligence feeds. The integration enabled by a plugin, for any specific data source, provides Copilot with a collection of capabilities. Each capability is like a function in software: it’s designed to do a specialized task within the scope of the data source. For example, the plugin to Microsoft Defender XDR includes a collection of individual capabilities that are used only by Microsoft Defender XDR. These include:
The ability to summarize an incident.
Support for incident response teams in resolving incidents through guided responses (a set of recommended actions based on the specific incident).
The ability to analyze scripts and code.
The ability to generate KQL queries from natural language input.
The ability to generate in...