spark_auto_mapper_fhir.resources.value_set¶
Module Contents¶
Classes¶
ValueSet | A ValueSet resource instance specifies a set of codes drawn from one or more code systems, intended for use in a particular context.
- class spark_auto_mapper_fhir.resources.value_set.ValueSet(*, id_=None, meta=None, implicitRules=None, language=None, text=None, contained=None, extension=None, modifierExtension=None, url=None, identifier=None, version=None, name=None, title=None, status, experimental=None, date=None, publisher=None, contact=None, description=None, useContext=None, jurisdiction=None, immutable=None, purpose=None, copyright=None, compose=None, expansion=None)¶
Bases:
spark_auto_mapper_fhir.base_types.fhir_resource_base.FhirResourceBase
ValueSet valueset.xsd
A ValueSet resource instance specifies a set of codes drawn from one or more
code systems, intended for use in a particular context. Value sets link between CodeSystem definitions and their use in [coded elements](terminologies.html).
If the element is present, it must have either a @value, an @id, or extensions
- param id_
The logical id of the resource, as used in the URL for the resource. Once assigned, this value never changes.
- param meta
The metadata about the resource. This is content that is maintained by the
infrastructure. Changes to the content might not always be associated with version changes to the resource.
- param implicitRules
A reference to a set of rules that were followed when the resource was
constructed, and which must be understood when processing the content. Often, this is a reference to an implementation guide that defines the special rules along with other profiles etc.
- param language
The base language in which the resource is written.
- param text
A human-readable narrative that contains a summary of the resource and can be
used to represent the content of the resource to a human. The narrative need not encode all the structured data, but is required to contain sufficient detail to make it “clinically safe” for a human to just read the narrative. Resource definitions may define what content should be represented in the narrative to ensure clinical safety.
- param contained
These resources do not have an independent existence apart from the resource
that contains them - they cannot be identified independently, and nor can they have their own independent transaction scope.
- param extension
May be used to represent additional information that is not part of the basic
definition of the resource. To make the use of extensions safe and manageable, there is a strict set of governance applied to the definition and use of extensions. Though any implementer can define an extension, there is a set of requirements that SHALL be met as part of the definition of the extension.
- param modifierExtension
May be used to represent additional information that is not part of the basic
definition of the resource and that modifies the understanding of the element that contains it and/or the understanding of the containing element’s descendants. Usually modifier elements provide negation or qualification. To make the use of extensions safe and manageable, there is a strict set of governance applied to the definition and use of extensions. Though any implementer is allowed to define an extension, there is a set of requirements that SHALL be met as part of the definition of the extension. Applications processing a resource are required to check for modifier extensions.
Modifier extensions SHALL NOT change the meaning of any elements on Resource or DomainResource (including cannot change the meaning of modifierExtension itself).
- param url
An absolute URI that is used to identify this value set when it is referenced
in a specification, model, design or an instance; also called its canonical identifier. This SHOULD be globally unique and SHOULD be a literal address at which an authoritative instance of this value set is (or will be) published. This URL can be the target of a canonical reference. It SHALL remain the same when the value set is stored on different servers.
- param identifier
A formal identifier that is used to identify this value set when it is
represented in other formats, or referenced in a specification, model, design or an instance.
- param version
The identifier that is used to identify this version of the value set when it
is referenced in a specification, model, design or instance. This is an arbitrary value managed by the value set author and is not expected to be globally unique. For example, it might be a timestamp (e.g. yyyymmdd) if a managed version is not available. There is also no expectation that versions can be placed in a lexicographical sequence.
- param name
A natural language name identifying the value set. This name should be usable
as an identifier for the module by machine processing applications such as code generation.
- param title
A short, descriptive, user-friendly title for the value set.
- param status
The status of this value set. Enables tracking the life-cycle of the content.
The status of the value set applies to the value set definition (ValueSet.compose) and the associated ValueSet metadata. Expansions do not have a state.
- param experimental
A Boolean value to indicate that this value set is authored for testing
purposes (or education/evaluation/marketing) and is not intended to be used for genuine usage.
- param date
The date (and optionally time) when the value set was created or revised (e.g. the ‘content logical definition’).
- param publisher
The name of the organization or individual that published the value set.
- param contact
Contact details to assist a user in finding and communicating with the publisher.
- param description
A free text natural language description of the value set from a consumer’s
perspective. The textual description specifies the span of meanings for concepts to be included within the Value Set Expansion, and also may specify the intended use and limitations of the Value Set.
- param useContext
The content was developed with a focus and intent of supporting the contexts
that are listed. These contexts may be general categories (gender, age, …) or may be references to specific programs (insurance plans, studies, …) and may be used to assist with indexing and searching for appropriate value set instances.
- param jurisdiction
A legal or geographic region in which the value set is intended to be used.
- param immutable
If this is set to ‘true’, then no new versions of the content logical definition can be created. Note: Other metadata might still change.
- param purpose
Explanation of why this value set is needed and why it has been designed as it has.
- param copyright
A copyright statement relating to the value set and/or its contents. Copyright
statements are generally legal restrictions on the use and publishing of the value set.
- param compose
A set of criteria that define the contents of the value set by including or
excluding codes selected from the specified code system(s) that the value set draws from. This is also known as the Content Logical Definition (CLD).
- param expansion
A value set can also be “expanded”, where the value set is turned into a
simple collection of enumerated codes. This element holds the expansion, if it has been performed.
- Parameters
id_ (Optional[spark_auto_mapper_fhir.fhir_types.id.FhirId]) –
meta (Optional[spark_auto_mapper_fhir.complex_types.meta.Meta]) –
implicitRules (Optional[spark_auto_mapper_fhir.fhir_types.uri.FhirUri]) –
language (Optional[spark_auto_mapper_fhir.value_sets.common_languages.CommonLanguagesCode]) –
text (Optional[spark_auto_mapper_fhir.complex_types.narrative.Narrative]) –
contained (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.complex_types.resource_container.ResourceContainer]]) –
extension (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.extensions.extension_base.ExtensionBase]]) –
modifierExtension (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.extensions.extension_base.ExtensionBase]]) –
url (Optional[spark_auto_mapper_fhir.fhir_types.uri.FhirUri]) –
identifier (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.complex_types.identifier.Identifier]]) –
version (Optional[spark_auto_mapper_fhir.fhir_types.string.FhirString]) –
name (Optional[spark_auto_mapper_fhir.fhir_types.string.FhirString]) –
title (Optional[spark_auto_mapper_fhir.fhir_types.string.FhirString]) –
status (spark_auto_mapper_fhir.value_sets.publication_status.PublicationStatusCode) –
experimental (Optional[spark_auto_mapper_fhir.fhir_types.boolean.FhirBoolean]) –
date (Optional[spark_auto_mapper_fhir.fhir_types.date_time.FhirDateTime]) –
publisher (Optional[spark_auto_mapper_fhir.fhir_types.string.FhirString]) –
contact (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.complex_types.contact_detail.ContactDetail]]) –
description (Optional[spark_auto_mapper_fhir.fhir_types.markdown.FhirMarkdown]) –
useContext (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.complex_types.usage_context.UsageContext]]) –
jurisdiction (Optional[spark_auto_mapper_fhir.fhir_types.list.FhirList[spark_auto_mapper_fhir.complex_types.codeable_concept.CodeableConcept[spark_auto_mapper_fhir.value_sets.jurisdiction_value_set.JurisdictionValueSetCode]]]) –
immutable (Optional[spark_auto_mapper_fhir.fhir_types.boolean.FhirBoolean]) –
purpose (Optional[spark_auto_mapper_fhir.fhir_types.markdown.FhirMarkdown]) –
copyright (Optional[spark_auto_mapper_fhir.fhir_types.markdown.FhirMarkdown]) –
compose (Optional[spark_auto_mapper_fhir.backbone_elements.value_set_compose.ValueSetCompose]) –
expansion (Optional[spark_auto_mapper_fhir.backbone_elements.value_set_expansion.ValueSetExpansion]) –
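The sketch below shows one way such a resource might be constructed. It is a minimal, illustrative example rather than one taken from the package's documentation: the id, URL, and publisher values are hypothetical, and it assumes that `PublicationStatusCode` wraps the raw FHIR status code string and that the `FhirString`/`FhirUri`/`FhirBoolean` parameters accept plain Python strings and booleans. Verify those assumptions against the installed version's `fhir_types` and `value_sets` modules.

```python
from spark_auto_mapper_fhir.resources.value_set import ValueSet
from spark_auto_mapper_fhir.value_sets.publication_status import PublicationStatusCode

# A minimal ValueSet: `status` is the only required argument; every other
# field shown here is optional and purely illustrative.
# Assumption: PublicationStatusCode accepts the raw FHIR code string
# ("draft" | "active" | "retired" | "unknown"), and the string/boolean
# parameters accept plain Python values.
example_value_set = ValueSet(
    id_="example-value-set",                          # logical id (hypothetical)
    url="http://example.org/fhir/ValueSet/example",   # canonical identifier (hypothetical)
    version="1.0.0",
    name="ExampleValueSet",
    title="Example Value Set",
    status=PublicationStatusCode("active"),
    experimental=True,
    publisher="Example Organization",
)
```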
- get_schema(self, include_extension)¶
- Parameters
include_extension (bool) –
- Return type
Optional[Union[pyspark.sql.types.StructType, pyspark.sql.types.DataType]]
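As a sketch of how the generated Spark schema might be inspected, assuming the `example_value_set` instance from the constructor example above:

```python
from pyspark.sql.types import StructType

# Retrieve the Spark schema for the resource. Passing include_extension=True
# would also include the extension columns; the return value may be a
# StructType, another DataType, or None.
schema = example_value_set.get_schema(include_extension=False)
if isinstance(schema, StructType):
    print(schema.simpleString())
```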