diTii.com Digital News Hub



Microsoft Azure Cosmos DB Change Feed Processor Library, More Launched

The Azure Cosmos DB Change Feed Processor Library, introduced on Thursday, boosts scalability while preserving simplicity of use. With change feed support, you can integrate with many different services, depending on what you need to do as changes occur.

This follows the recent release of change feed support, which helps developers build powerful applications on top of Azure Cosmos DB, writes a program manager for Cosmos DB.

Azure Cosmos DB is a fast and flexible, globally replicated database service used for storing high-volume transactional and operational data. As your data storage needs grow, it’s likely that multiple partitions will be used to store data. Although it’s possible to manually read changes from each partition, “the Change Feed Processor reads across partitions and distributes change feed event processing across multiple consumers.”
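To see why manual reading becomes tedious, here is a minimal sketch (plain Python, not the Azure SDK; the names `Partition` and `read_changes` are hypothetical) of what a caller has to do once changes span several partitions:

```python
# Hypothetical sketch: reading a change feed per partition by hand.
# The caller must track one continuation token per partition itself --
# exactly the bookkeeping the Change Feed Processor takes over.

class Partition:
    def __init__(self, pid, changes):
        self.pid = pid
        self.changes = list(changes)   # ordered change events for this partition

def read_changes(partition, continuation=0):
    """Return events after a continuation token, plus the new token."""
    events = partition.changes[continuation:]
    return events, len(partition.changes)

partitions = [Partition(0, ["a", "b"]), Partition(1, ["c"])]
tokens = {p.pid: 0 for p in partitions}    # one token per partition, managed manually
seen = []
for p in partitions:
    events, tokens[p.pid] = read_changes(p, tokens[p.pid])
    seen.extend(events)

print(seen)   # every change is observed, but coordination is entirely manual
```

With two partitions this is manageable; with dozens of partitions and several consumer processes, the token bookkeeping and work distribution are what the processor library automates.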

This library provides “a thread-safe, multi-process, safe runtime environment with checkpoint and partition lease management for change feed operations.”

Microsoft explains that the Change Feed Processor Library can be used when you want to:

  • pull updates from the change feed when data is stored across multiple partitions
  • move or replicate data from one collection to another
  • execute in parallel actions triggered by updates to the data and the change feed

To set up, install the Change Feed Processor Library NuGet package and create a lease collection through an account close to the write region. “This collection will keep track of change feed reading progress per partition and host information.”

Chart: Azure Cosmos DB Change Feed Processor Library
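The lease collection's role can be sketched in a few lines (an illustrative model, not the real library; `Lease`, `acquire_leases`, and `checkpoint` are hypothetical names): each partition has a lease recording which host owns it and how far reading has progressed, so hosts can split the work and resume after failover.

```python
# Hypothetical sketch of lease-based distribution and checkpointing,
# mirroring what the Change Feed Processor's lease collection stores.
from dataclasses import dataclass

@dataclass
class Lease:
    partition_id: int
    owner: str = ""        # which host currently processes this partition
    continuation: int = 0  # checkpoint: last position read in the feed

def acquire_leases(leases, hosts):
    """Spread partition leases evenly across the available hosts."""
    for i, lease in enumerate(leases):
        lease.owner = hosts[i % len(hosts)]
    return leases

def checkpoint(lease, position):
    """Persist reading progress so another host can resume from here."""
    lease.continuation = position

leases = [Lease(p) for p in range(4)]
acquire_leases(leases, ["host-A", "host-B"])
checkpoint(leases[0], 17)

print({l.partition_id: l.owner for l in leases})
# each host owns two partitions; partition 0 is checkpointed at position 17
```

Because ownership and progress live in shared state rather than in any one consumer, adding a host rebalances the partitions and a crashed host's partitions can be picked up where their checkpoints left off.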

Microsoft Azure’s G/GS, LS, H, and N-series virtual machines (VMs), including the graphics-intensive GPU-enabled sizes, are now available to customers in UK South.

The new NC and NV sizes, also known as GPU-enabled instances, are specialized virtual machines that include NVIDIA GPU cards, optimized for different scenarios and use cases.

“NV sizes are optimized and designed for remote visualization, streaming, gaming, encoding, and VDI scenarios utilizing frameworks such as OpenGL and DirectX,” while “the NC sizes are optimized for more compute-intensive and network-intensive applications and algorithms, including CUDA- and OpenCL-based applications and simulations.”

Chart: Azure N-series VM sizes

The new Azure G/GS series, also available today, is ideal for applications that demand faster CPUs, better local disk performance, or more memory, while the “LS-series is optimized for workloads that require low latency local storage, like NoSQL databases (for example, Cassandra, MongoDB, Cloudera, and Redis).”

The new H series VMs are an excellent fit for compute-intensive workloads and provide cutting-edge performance, as well as an RDMA back-end network for MPI workloads.

The August updates to the Azure Analysis Services web designer include a mix of fixes and new functionality:

  • adding measures is a bit simpler with a multiline code editor that recognizes DAX formula syntax.
  • the model JSON editor now includes a mini document map on the right-hand side to make browsing the JSON document simpler.
  • you can now use hierarchies and display folders when graphically designing queries.
  • you can now create new relationships, or edit existing ones between tables, with the new relationship editor dialog.
  • when you need to connect to your server from other tools such as SSMS or SSDT, you can now simply copy your full server name from the server blade.
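Relationships defined in the new editor end up in the model's JSON definition. As an illustration, a Tabular-model-style relationship entry looks roughly like the following (the table and column names here are made up for the example):

```json
{
  "relationships": [
    {
      "name": "SalesToDate",
      "fromTable": "Sales",
      "fromColumn": "OrderDateKey",
      "toTable": "Date",
      "toColumn": "DateKey"
    }
  ]
}
```

The same fragment is what you would see (and could hand-edit) in the model JSON editor mentioned above.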

The web designer is a browser-based experience that allows developers to start creating and managing Azure Analysis Services (AAS) semantic models quickly and easily. While SQL Server Data Tools (SSDT) and SQL Server Management Studio (SSMS) remain the primary tools for development, “this new experience is intended to make simple changes fast and easy,” explains Microsoft.

Aiming to provide training to developers, Microsoft has launched free online training for Azure Data Lake that covers “all the topics a developer needs to know to start being productive with big data and how to address the challenges of authoring, debugging, and optimizing at scale.”

Explore the free training at Microsoft Virtual Academy, and watch the “Introduction to Azure Data Lake” video below:

Microsoft and Red Hat are helping IT pros run Red Hat-based solutions on Microsoft Azure with a best-in-class integrated support experience from both companies.

Today, IT departments are being asked to help their teams increase innovation while reducing costs, create environments that scale up and down flexibly, and give their organizations the tools needed to meet increasingly global customer needs. To help accomplish these outcomes, many are moving their workloads to the cloud, while trying to figure out how to do so with minimum hassle or downtime. Moving Red Hat Enterprise Linux (RHEL) workloads to Microsoft Azure offers a unique combination of benefits, writes José Miguel Parrella.

Check out the webinar here (registration required).

Dynamics 365 for Customer Insights (DCI) is now offering a new Power BI connector with its July update, which allows aggregated and enriched data in DCI to be analyzed in Power BI, combining the analytic power and data visualization capabilities of the two services.

To connect to DCI in Power BI Desktop, go to Get Data -> Online Services and select Dynamics 365 for Customer Insights (Beta) from the Online Services list.


Learn more about the ‘why’ behind the update to Azure AD Conditional Access for Office.com

Update 08/05: In a post on Friday, Microsoft’s Identity Division provided some background about an upcoming change that involves enforcing conditional access policies on Office.com.

“a change will roll out that requires users to satisfy any policies set on Exchange Online and SharePoint Online when accessing Office.com, on August 24th,” writes Alex Simons. For example, “if a policy requiring multi-factor authentication (MFA) or a compliant device has been applied to SharePoint or Exchange, this policy will also apply to users signing into Office.com.”

The change addresses feedback Microsoft has received from customers who have noticed that “some features break in Office.com when a policy is applied to Exchange or SharePoint,” he writes. “These include searching for documents and email, loading your customizations in the app launcher, creating new documents and viewing your calendar.”

Those features access Exchange and SharePoint data, so they’re subject to Exchange and SharePoint policies. Once users satisfy these policies when they access Office.com, they will have access to Exchange and SharePoint data, so these features will continue to work, Simons writes.
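The policy logic described above can be modeled in a few lines (an illustrative sketch, not the Azure AD API; all names here are hypothetical): because Office.com features read data from Exchange and SharePoint, a sign-in to Office.com must satisfy the union of the policies applied to those services.

```python
# Hypothetical model of the Office.com policy change: a sign-in must
# satisfy every conditional access requirement set on the backing services.

service_policies = {
    "ExchangeOnline": {"mfa"},
    "SharePointOnline": {"compliant_device"},
}

def policies_for_office_home(policies):
    """Office.com pulls mail, documents, and calendar data, so it
    inherits the combined requirements of all backing services."""
    required = set()
    for service_requirements in policies.values():
        required |= service_requirements
    return required

def can_access_office_home(satisfied, policies):
    """A sign-in succeeds only if all inherited requirements are met."""
    return policies_for_office_home(policies) <= set(satisfied)

print(policies_for_office_home(service_policies))
print(can_access_office_home({"mfa"}, service_policies))                       # False
print(can_access_office_home({"mfa", "compliant_device"}, service_policies))   # True
```

This is why, after the change, a user who has only completed MFA can still be blocked from Office.com if SharePoint separately requires a compliant device.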
