updated to 0.8.11 (#373)
* updated to 0.8.11

* minor updates
m-kovalsky authored Dec 19, 2024
1 parent 9abcb6e commit 3c9f579
Showing 9 changed files with 17 additions and 19 deletions.
3 changes: 2 additions & 1 deletion README.md
@@ -1,7 +1,7 @@
# Semantic Link Labs

[![PyPI version](https://badge.fury.io/py/semantic-link-labs.svg)](https://badge.fury.io/py/semantic-link-labs)
-[![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.8.10&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
+[![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.8.11&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Downloads](https://static.pepy.tech/badge/semantic-link-labs)](https://pepy.tech/project/semantic-link-labs)

@@ -116,6 +116,7 @@ An even better way to ensure the semantic-link-labs library is available in your
2. Select your newly created environment within the 'Environment' drop down in the navigation bar at the top of the notebook

## Version History
+* [0.8.11](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.11) (December 19, 2024)
* [0.8.10](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.10) (December 16, 2024)
* [0.8.9](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.9) (December 4, 2024)
* [0.8.8](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.8) (November 28, 2024)
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -13,7 +13,7 @@
project = 'semantic-link-labs'
copyright = '2024, Microsoft and community'
author = 'Microsoft and community'
-release = '0.8.10'
+release = '0.8.11'

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -7,7 +7,7 @@ name="semantic-link-labs"
authors = [
{ name = "Microsoft Corporation" },
]
-version="0.8.10"
+version="0.8.11"
description="Semantic Link Labs for Microsoft Fabric"
readme="README.md"
requires-python=">=3.10,<3.12"
2 changes: 1 addition & 1 deletion src/sempy_labs/_connections.py
@@ -485,7 +485,7 @@ def create_vnet_connection(
user_name: str,
password: str,
privacy_level: str,
-connection_encryption: Optional[str] = "NotEncrypted",
+connection_encryption: str = "NotEncrypted",
skip_test_connection: bool = False,
):
"""
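The change in this hunk tightens the annotation: `connection_encryption` always has a usable string default, and `None` was never a valid argument, so `Optional[str]` overstated the accepted types. A minimal sketch of the distinction, using hypothetical stubs rather than the library function:

```python
from typing import Optional


# Before: Optional[str] advertises that None is accepted, yet the body
# never handles None -- a caller passing None would crash at runtime.
def connect_old(connection_encryption: Optional[str] = "NotEncrypted") -> str:
    return connection_encryption.upper()  # AttributeError if None is passed


# After: a plain str matches the real contract (non-None default, None
# never valid), so type checkers now reject None at call sites.
def connect_new(connection_encryption: str = "NotEncrypted") -> str:
    return connection_encryption.upper()


print(connect_new())  # NOTENCRYPTED
```

The same reasoning applies to the `source_type: Optional[str] = "Lakehouse"` parameter changed later in this commit.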
4 changes: 2 additions & 2 deletions src/sempy_labs/_environments.py
@@ -134,6 +134,8 @@ def publish_environment(environment: str, workspace: Optional[str | UUID] = None
"""
Publishes a Fabric environment.
This is a wrapper function for the following API: `Spark Libraries - Publish Environment <https://learn.microsoft.com/rest/api/fabric/environment/spark-libraries/publish-environment>`_.
Parameters
----------
environment: str
@@ -144,8 +146,6 @@ or if no lakehouse attached,
or if no lakehouse attached, resolves to the workspace of the notebook.
"""

-# https://learn.microsoft.com/en-us/rest/api/fabric/environment/spark-libraries/publish-environment?tabs=HTTP

from sempy_labs._helper_functions import resolve_environment_id

(workspace_name, workspace_id) = resolve_workspace_name_and_id(workspace)
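The two lines added to this docstring appear to be blank separators: numpydoc-style parsing (used for the project's Read the Docs site) expects a blank line before a `Parameters` header, otherwise the section can be folded into the summary text. A minimal sketch of the corrected layout; the stub below is illustrative, not the library implementation:

```python
def publish_environment(environment: str) -> None:
    """
    Publishes a Fabric environment.

    This is a wrapper function for the Spark Libraries - Publish
    Environment REST API.

    Parameters
    ----------
    environment : str
        Name of the environment.
    """


# The blank line before "Parameters" lets numpydoc recognize the section.
print("Parameters" in publish_environment.__doc__)  # True
```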
2 changes: 1 addition & 1 deletion src/sempy_labs/_helper_functions.py
@@ -306,7 +306,7 @@ def resolve_lakehouse_id(


def get_direct_lake_sql_endpoint(
-dataset: str | UUID, workspace: Optional[str] = None
+dataset: str | UUID, workspace: Optional[str | UUID] = None
) -> UUID:
"""
Obtains the SQL Endpoint ID of the semantic model.
@@ -83,7 +83,7 @@ def update_direct_lake_model_connection(
dataset: str | UUID,
workspace: Optional[str | UUID] = None,
source: Optional[str] = None,
-source_type: Optional[str] = "Lakehouse",
+source_type: str = "Lakehouse",
source_workspace: Optional[str | UUID] = None,
):
"""
11 changes: 4 additions & 7 deletions src/sempy_labs/directlake/_update_directlake_partition_entity.py
@@ -24,13 +24,13 @@ def update_direct_lake_partition_entity(
Parameters
----------
-dataset : str | UUID
+dataset : str | uuid.UUID
Name or ID of the semantic model.
table_name : str, List[str]
Name of the table(s) in the semantic model.
entity_name : str, List[str]
Name of the lakehouse table to be mapped to the semantic model table.
-workspace : str | UUID, default=None
+workspace : str | uuid.UUID, default=None
The Fabric workspace name or ID in which the semantic model exists.
Defaults to None which resolves to the workspace of the attached lakehouse
or if no lakehouse attached, resolves to the workspace of the notebook.
@@ -94,21 +94,18 @@ def add_table_to_direct_lake_semantic_model(
Parameters
----------
-dataset : str | UUID
+dataset : str | uuid.UUID
Name or ID of the semantic model.
table_name : str, List[str]
Name of the table in the semantic model.
lakehouse_table_name : str
The name of the Fabric lakehouse table.
refresh : bool, default=True
Refreshes the table after it is added to the semantic model.
-workspace : str | UUID, default=None
+workspace : str | uuid.UUID, default=None
The name or ID of the Fabric workspace in which the semantic model resides.
Defaults to None which resolves to the workspace of the attached lakehouse
or if no lakehouse attached, resolves to the workspace of the notebook.
-Returns
--------
"""

sempy.fabric._client._utils._init_analysis_services()
8 changes: 4 additions & 4 deletions src/sempy_labs/directlake/_warm_cache.py
@@ -28,13 +28,13 @@ def warm_direct_lake_cache_perspective(
Parameters
----------
-dataset : str | UUID
+dataset : str | uuid.UUID
Name or ID of the semantic model.
perspective : str
Name of the perspective which contains objects to be used for warming the cache.
add_dependencies : bool, default=False
Includes object dependencies in the cache warming process.
-workspace : str | UUID, default=None
+workspace : str | uuid.UUID, default=None
The Fabric workspace name or ID.
Defaults to None which resolves to the workspace of the attached lakehouse
or if no lakehouse attached, resolves to the workspace of the notebook.
@@ -138,9 +138,9 @@ def warm_direct_lake_cache_isresident(
Parameters
----------
-dataset : str | UUID
+dataset : str | uuid.UUID
Name or ID of the semantic model.
-workspace : str | UUID, default=None
+workspace : str | uuid.UUID, default=None
The Fabric workspace name or ID.
Defaults to None which resolves to the workspace of the attached lakehouse
or if no lakehouse attached, resolves to the workspace of the notebook.