published on Thursday, Mar 19, 2026 by Pulumi
Create FeatureEngineeringFeature Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new FeatureEngineeringFeature(name: string, args: FeatureEngineeringFeatureArgs, opts?: CustomResourceOptions);
@overload
def FeatureEngineeringFeature(resource_name: str,
args: FeatureEngineeringFeatureArgs,
opts: Optional[ResourceOptions] = None)
@overload
def FeatureEngineeringFeature(resource_name: str,
opts: Optional[ResourceOptions] = None,
full_name: Optional[str] = None,
function: Optional[FeatureEngineeringFeatureFunctionArgs] = None,
source: Optional[FeatureEngineeringFeatureSourceArgs] = None,
description: Optional[str] = None,
entities: Optional[Sequence[FeatureEngineeringFeatureEntityArgs]] = None,
filter_condition: Optional[str] = None,
inputs: Optional[Sequence[str]] = None,
lineage_context: Optional[FeatureEngineeringFeatureLineageContextArgs] = None,
provider_config: Optional[FeatureEngineeringFeatureProviderConfigArgs] = None,
time_window: Optional[FeatureEngineeringFeatureTimeWindowArgs] = None,
timeseries_column: Optional[FeatureEngineeringFeatureTimeseriesColumnArgs] = None)
func NewFeatureEngineeringFeature(ctx *Context, name string, args FeatureEngineeringFeatureArgs, opts ...ResourceOption) (*FeatureEngineeringFeature, error)
public FeatureEngineeringFeature(string name, FeatureEngineeringFeatureArgs args, CustomResourceOptions? opts = null)
public FeatureEngineeringFeature(String name, FeatureEngineeringFeatureArgs args)
public FeatureEngineeringFeature(String name, FeatureEngineeringFeatureArgs args, CustomResourceOptions options)
type: databricks:FeatureEngineeringFeature
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args FeatureEngineeringFeatureArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args FeatureEngineeringFeatureArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args FeatureEngineeringFeatureArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args FeatureEngineeringFeatureArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args FeatureEngineeringFeatureArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var featureEngineeringFeatureResource = new Databricks.FeatureEngineeringFeature("featureEngineeringFeatureResource", new()
{
FullName = "string",
Function = new Databricks.Inputs.FeatureEngineeringFeatureFunctionArgs
{
AggregationFunction = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionArgs
{
ApproxCountDistinct = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs
{
Input = "string",
RelativeSd = 0,
},
ApproxPercentile = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs
{
Input = "string",
Percentile = 0,
Accuracy = 0,
},
Avg = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs
{
Input = "string",
},
CountFunction = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs
{
Input = "string",
},
First = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs
{
Input = "string",
},
Last = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs
{
Input = "string",
},
Max = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs
{
Input = "string",
},
Min = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs
{
Input = "string",
},
StddevPop = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs
{
Input = "string",
},
StddevSamp = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs
{
Input = "string",
},
Sum = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs
{
Input = "string",
},
TimeWindow = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs
{
Continuous = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs
{
WindowDuration = "string",
Offset = "string",
},
Sliding = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs
{
SlideDuration = "string",
WindowDuration = "string",
},
Tumbling = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs
{
WindowDuration = "string",
},
},
VarPop = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs
{
Input = "string",
},
VarSamp = new Databricks.Inputs.FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs
{
Input = "string",
},
},
ExtraParameters = new[]
{
new Databricks.Inputs.FeatureEngineeringFeatureFunctionExtraParameterArgs
{
Key = "string",
Value = "string",
},
},
FunctionType = "string",
},
Source = new Databricks.Inputs.FeatureEngineeringFeatureSourceArgs
{
DeltaTableSource = new Databricks.Inputs.FeatureEngineeringFeatureSourceDeltaTableSourceArgs
{
FullName = "string",
DataframeSchema = "string",
EntityColumns = new[]
{
"string",
},
FilterCondition = "string",
TimeseriesColumn = "string",
TransformationSql = "string",
},
KafkaSource = new Databricks.Inputs.FeatureEngineeringFeatureSourceKafkaSourceArgs
{
Name = "string",
EntityColumnIdentifiers = new[]
{
new Databricks.Inputs.FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs
{
VariantExprPath = "string",
},
},
FilterCondition = "string",
TimeseriesColumnIdentifier = new Databricks.Inputs.FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs
{
VariantExprPath = "string",
},
},
},
Description = "string",
Entities = new[]
{
new Databricks.Inputs.FeatureEngineeringFeatureEntityArgs
{
Name = "string",
},
},
FilterCondition = "string",
Inputs = new[]
{
"string",
},
LineageContext = new Databricks.Inputs.FeatureEngineeringFeatureLineageContextArgs
{
JobContext = new Databricks.Inputs.FeatureEngineeringFeatureLineageContextJobContextArgs
{
JobId = 0,
JobRunId = 0,
},
NotebookId = 0,
},
ProviderConfig = new Databricks.Inputs.FeatureEngineeringFeatureProviderConfigArgs
{
WorkspaceId = "string",
},
TimeWindow = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowArgs
{
Continuous = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowContinuousArgs
{
WindowDuration = "string",
Offset = "string",
},
Sliding = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowSlidingArgs
{
SlideDuration = "string",
WindowDuration = "string",
},
Tumbling = new Databricks.Inputs.FeatureEngineeringFeatureTimeWindowTumblingArgs
{
WindowDuration = "string",
},
},
TimeseriesColumn = new Databricks.Inputs.FeatureEngineeringFeatureTimeseriesColumnArgs
{
Name = "string",
},
});
example, err := databricks.NewFeatureEngineeringFeature(ctx, "featureEngineeringFeatureResource", &databricks.FeatureEngineeringFeatureArgs{
FullName: pulumi.String("string"),
Function: &databricks.FeatureEngineeringFeatureFunctionArgs{
AggregationFunction: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionArgs{
ApproxCountDistinct: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs{
Input: pulumi.String("string"),
RelativeSd: pulumi.Float64(0),
},
ApproxPercentile: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs{
Input: pulumi.String("string"),
Percentile: pulumi.Float64(0),
Accuracy: pulumi.Int(0),
},
Avg: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs{
Input: pulumi.String("string"),
},
CountFunction: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs{
Input: pulumi.String("string"),
},
First: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs{
Input: pulumi.String("string"),
},
Last: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs{
Input: pulumi.String("string"),
},
Max: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs{
Input: pulumi.String("string"),
},
Min: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs{
Input: pulumi.String("string"),
},
StddevPop: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs{
Input: pulumi.String("string"),
},
StddevSamp: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs{
Input: pulumi.String("string"),
},
Sum: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs{
Input: pulumi.String("string"),
},
TimeWindow: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs{
Continuous: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs{
WindowDuration: pulumi.String("string"),
Offset: pulumi.String("string"),
},
Sliding: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs{
SlideDuration: pulumi.String("string"),
WindowDuration: pulumi.String("string"),
},
Tumbling: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs{
WindowDuration: pulumi.String("string"),
},
},
VarPop: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs{
Input: pulumi.String("string"),
},
VarSamp: &databricks.FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs{
Input: pulumi.String("string"),
},
},
ExtraParameters: databricks.FeatureEngineeringFeatureFunctionExtraParameterArray{
&databricks.FeatureEngineeringFeatureFunctionExtraParameterArgs{
Key: pulumi.String("string"),
Value: pulumi.String("string"),
},
},
FunctionType: pulumi.String("string"),
},
Source: &databricks.FeatureEngineeringFeatureSourceArgs{
DeltaTableSource: &databricks.FeatureEngineeringFeatureSourceDeltaTableSourceArgs{
FullName: pulumi.String("string"),
DataframeSchema: pulumi.String("string"),
EntityColumns: pulumi.StringArray{
pulumi.String("string"),
},
FilterCondition: pulumi.String("string"),
TimeseriesColumn: pulumi.String("string"),
TransformationSql: pulumi.String("string"),
},
KafkaSource: &databricks.FeatureEngineeringFeatureSourceKafkaSourceArgs{
Name: pulumi.String("string"),
EntityColumnIdentifiers: databricks.FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArray{
&databricks.FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs{
VariantExprPath: pulumi.String("string"),
},
},
FilterCondition: pulumi.String("string"),
TimeseriesColumnIdentifier: &databricks.FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs{
VariantExprPath: pulumi.String("string"),
},
},
},
Description: pulumi.String("string"),
Entities: databricks.FeatureEngineeringFeatureEntityArray{
&databricks.FeatureEngineeringFeatureEntityArgs{
Name: pulumi.String("string"),
},
},
FilterCondition: pulumi.String("string"),
Inputs: pulumi.StringArray{
pulumi.String("string"),
},
LineageContext: &databricks.FeatureEngineeringFeatureLineageContextArgs{
JobContext: &databricks.FeatureEngineeringFeatureLineageContextJobContextArgs{
JobId: pulumi.Int(0),
JobRunId: pulumi.Int(0),
},
NotebookId: pulumi.Int(0),
},
ProviderConfig: &databricks.FeatureEngineeringFeatureProviderConfigArgs{
WorkspaceId: pulumi.String("string"),
},
TimeWindow: &databricks.FeatureEngineeringFeatureTimeWindowArgs{
Continuous: &databricks.FeatureEngineeringFeatureTimeWindowContinuousArgs{
WindowDuration: pulumi.String("string"),
Offset: pulumi.String("string"),
},
Sliding: &databricks.FeatureEngineeringFeatureTimeWindowSlidingArgs{
SlideDuration: pulumi.String("string"),
WindowDuration: pulumi.String("string"),
},
Tumbling: &databricks.FeatureEngineeringFeatureTimeWindowTumblingArgs{
WindowDuration: pulumi.String("string"),
},
},
TimeseriesColumn: &databricks.FeatureEngineeringFeatureTimeseriesColumnArgs{
Name: pulumi.String("string"),
},
})
var featureEngineeringFeatureResource = new FeatureEngineeringFeature("featureEngineeringFeatureResource", FeatureEngineeringFeatureArgs.builder()
.fullName("string")
.function(FeatureEngineeringFeatureFunctionArgs.builder()
.aggregationFunction(FeatureEngineeringFeatureFunctionAggregationFunctionArgs.builder()
.approxCountDistinct(FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs.builder()
.input("string")
.relativeSd(0.0)
.build())
.approxPercentile(FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs.builder()
.input("string")
.percentile(0.0)
.accuracy(0)
.build())
.avg(FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs.builder()
.input("string")
.build())
.countFunction(FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs.builder()
.input("string")
.build())
.first(FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs.builder()
.input("string")
.build())
.last(FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs.builder()
.input("string")
.build())
.max(FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs.builder()
.input("string")
.build())
.min(FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs.builder()
.input("string")
.build())
.stddevPop(FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs.builder()
.input("string")
.build())
.stddevSamp(FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs.builder()
.input("string")
.build())
.sum(FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs.builder()
.input("string")
.build())
.timeWindow(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs.builder()
.continuous(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs.builder()
.windowDuration("string")
.offset("string")
.build())
.sliding(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs.builder()
.slideDuration("string")
.windowDuration("string")
.build())
.tumbling(FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs.builder()
.windowDuration("string")
.build())
.build())
.varPop(FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs.builder()
.input("string")
.build())
.varSamp(FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs.builder()
.input("string")
.build())
.build())
.extraParameters(FeatureEngineeringFeatureFunctionExtraParameterArgs.builder()
.key("string")
.value("string")
.build())
.functionType("string")
.build())
.source(FeatureEngineeringFeatureSourceArgs.builder()
.deltaTableSource(FeatureEngineeringFeatureSourceDeltaTableSourceArgs.builder()
.fullName("string")
.dataframeSchema("string")
.entityColumns("string")
.filterCondition("string")
.timeseriesColumn("string")
.transformationSql("string")
.build())
.kafkaSource(FeatureEngineeringFeatureSourceKafkaSourceArgs.builder()
.name("string")
.entityColumnIdentifiers(FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs.builder()
.variantExprPath("string")
.build())
.filterCondition("string")
.timeseriesColumnIdentifier(FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs.builder()
.variantExprPath("string")
.build())
.build())
.build())
.description("string")
.entities(FeatureEngineeringFeatureEntityArgs.builder()
.name("string")
.build())
.filterCondition("string")
.inputs("string")
.lineageContext(FeatureEngineeringFeatureLineageContextArgs.builder()
.jobContext(FeatureEngineeringFeatureLineageContextJobContextArgs.builder()
.jobId(0)
.jobRunId(0)
.build())
.notebookId(0)
.build())
.providerConfig(FeatureEngineeringFeatureProviderConfigArgs.builder()
.workspaceId("string")
.build())
.timeWindow(FeatureEngineeringFeatureTimeWindowArgs.builder()
.continuous(FeatureEngineeringFeatureTimeWindowContinuousArgs.builder()
.windowDuration("string")
.offset("string")
.build())
.sliding(FeatureEngineeringFeatureTimeWindowSlidingArgs.builder()
.slideDuration("string")
.windowDuration("string")
.build())
.tumbling(FeatureEngineeringFeatureTimeWindowTumblingArgs.builder()
.windowDuration("string")
.build())
.build())
.timeseriesColumn(FeatureEngineeringFeatureTimeseriesColumnArgs.builder()
.name("string")
.build())
.build());
feature_engineering_feature_resource = databricks.FeatureEngineeringFeature("featureEngineeringFeatureResource",
full_name="string",
function={
"aggregation_function": {
"approx_count_distinct": {
"input": "string",
"relative_sd": 0,
},
"approx_percentile": {
"input": "string",
"percentile": 0,
"accuracy": 0,
},
"avg": {
"input": "string",
},
"count_function": {
"input": "string",
},
"first": {
"input": "string",
},
"last": {
"input": "string",
},
"max": {
"input": "string",
},
"min": {
"input": "string",
},
"stddev_pop": {
"input": "string",
},
"stddev_samp": {
"input": "string",
},
"sum": {
"input": "string",
},
"time_window": {
"continuous": {
"window_duration": "string",
"offset": "string",
},
"sliding": {
"slide_duration": "string",
"window_duration": "string",
},
"tumbling": {
"window_duration": "string",
},
},
"var_pop": {
"input": "string",
},
"var_samp": {
"input": "string",
},
},
"extra_parameters": [{
"key": "string",
"value": "string",
}],
"function_type": "string",
},
source={
"delta_table_source": {
"full_name": "string",
"dataframe_schema": "string",
"entity_columns": ["string"],
"filter_condition": "string",
"timeseries_column": "string",
"transformation_sql": "string",
},
"kafka_source": {
"name": "string",
"entity_column_identifiers": [{
"variant_expr_path": "string",
}],
"filter_condition": "string",
"timeseries_column_identifier": {
"variant_expr_path": "string",
},
},
},
description="string",
entities=[{
"name": "string",
}],
filter_condition="string",
inputs=["string"],
lineage_context={
"job_context": {
"job_id": 0,
"job_run_id": 0,
},
"notebook_id": 0,
},
provider_config={
"workspace_id": "string",
},
time_window={
"continuous": {
"window_duration": "string",
"offset": "string",
},
"sliding": {
"slide_duration": "string",
"window_duration": "string",
},
"tumbling": {
"window_duration": "string",
},
},
timeseries_column={
"name": "string",
})
const featureEngineeringFeatureResource = new databricks.FeatureEngineeringFeature("featureEngineeringFeatureResource", {
fullName: "string",
"function": {
aggregationFunction: {
approxCountDistinct: {
input: "string",
relativeSd: 0,
},
approxPercentile: {
input: "string",
percentile: 0,
accuracy: 0,
},
avg: {
input: "string",
},
countFunction: {
input: "string",
},
first: {
input: "string",
},
last: {
input: "string",
},
max: {
input: "string",
},
min: {
input: "string",
},
stddevPop: {
input: "string",
},
stddevSamp: {
input: "string",
},
sum: {
input: "string",
},
timeWindow: {
continuous: {
windowDuration: "string",
offset: "string",
},
sliding: {
slideDuration: "string",
windowDuration: "string",
},
tumbling: {
windowDuration: "string",
},
},
varPop: {
input: "string",
},
varSamp: {
input: "string",
},
},
extraParameters: [{
key: "string",
value: "string",
}],
functionType: "string",
},
source: {
deltaTableSource: {
fullName: "string",
dataframeSchema: "string",
entityColumns: ["string"],
filterCondition: "string",
timeseriesColumn: "string",
transformationSql: "string",
},
kafkaSource: {
name: "string",
entityColumnIdentifiers: [{
variantExprPath: "string",
}],
filterCondition: "string",
timeseriesColumnIdentifier: {
variantExprPath: "string",
},
},
},
description: "string",
entities: [{
name: "string",
}],
filterCondition: "string",
inputs: ["string"],
lineageContext: {
jobContext: {
jobId: 0,
jobRunId: 0,
},
notebookId: 0,
},
providerConfig: {
workspaceId: "string",
},
timeWindow: {
continuous: {
windowDuration: "string",
offset: "string",
},
sliding: {
slideDuration: "string",
windowDuration: "string",
},
tumbling: {
windowDuration: "string",
},
},
timeseriesColumn: {
name: "string",
},
});
type: databricks:FeatureEngineeringFeature
properties:
description: string
entities:
- name: string
filterCondition: string
fullName: string
function:
aggregationFunction:
approxCountDistinct:
input: string
relativeSd: 0
approxPercentile:
accuracy: 0
input: string
percentile: 0
avg:
input: string
countFunction:
input: string
first:
input: string
last:
input: string
max:
input: string
min:
input: string
stddevPop:
input: string
stddevSamp:
input: string
sum:
input: string
timeWindow:
continuous:
offset: string
windowDuration: string
sliding:
slideDuration: string
windowDuration: string
tumbling:
windowDuration: string
varPop:
input: string
varSamp:
input: string
extraParameters:
- key: string
value: string
functionType: string
inputs:
- string
lineageContext:
jobContext:
jobId: 0
jobRunId: 0
notebookId: 0
providerConfig:
workspaceId: string
source:
deltaTableSource:
dataframeSchema: string
entityColumns:
- string
filterCondition: string
fullName: string
timeseriesColumn: string
transformationSql: string
kafkaSource:
entityColumnIdentifiers:
- variantExprPath: string
filterCondition: string
name: string
timeseriesColumnIdentifier:
variantExprPath: string
timeWindow:
continuous:
offset: string
windowDuration: string
sliding:
slideDuration: string
windowDuration: string
tumbling:
windowDuration: string
timeseriesColumn:
name: string
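The reference example above fills every input with a placeholder; a real feature typically sets only a handful of them. The following minimal Python sketch declares a seven-day transaction count keyed by user. All catalog, schema, table, and column names are hypothetical, and the functionType and window-duration values are assumptions — consult the provider schema for the allowed values.

```python
import pulumi_databricks as databricks

# All names below are hypothetical; substitute your own catalog/schema/table.
txn_count_7d = databricks.FeatureEngineeringFeature(
    "txnCount7d",
    full_name="main.features.user_txn_count_7d",  # three-part feature name
    entities=[{"name": "user_id"}],               # aggregation / lookup key
    timeseries_column={"name": "event_ts"},       # point-in-time column
    source={
        "delta_table_source": {
            "full_name": "main.raw.transactions",
            "entity_columns": ["user_id"],
            "timeseries_column": "event_ts",
        },
    },
    function={
        # Placeholder value; the set of valid function types is not listed here.
        "function_type": "string",
        "aggregation_function": {
            "count_function": {"input": "txn_id"},
            "time_window": {"tumbling": {"window_duration": "7d"}},
        },
    },
)
```

Deploying this requires a configured Pulumi stack and Databricks credentials; the snippet only illustrates the shape of a minimal declaration.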
FeatureEngineeringFeature Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
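For instance, the timeseries_column input can be written either way; the two forms below are interchangeable (a sketch, assuming the pulumi_databricks package is installed):

```python
import pulumi_databricks as databricks

# Argument class: typed and IDE-friendly.
col_args = databricks.FeatureEngineeringFeatureTimeseriesColumnArgs(name="event_ts")

# Dictionary literal: the same snake_case keys, no Args type needed.
col_dict = {"name": "event_ts"}

# Either value is accepted by the timeseries_column parameter of
# FeatureEngineeringFeature.
```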
The FeatureEngineeringFeature resource accepts the following input properties:
- FullName string - The full three-part name (catalog, schema, name) of the feature
- Function FeatureEngineeringFeatureFunction - The function by which the feature is computed
- Source FeatureEngineeringFeatureSource - The data source of the feature
- Description string - The description of the feature
- Entities List&lt;FeatureEngineeringFeatureEntity&gt; - The entity columns for the feature, used as aggregation keys and for query-time lookup
- FilterCondition string - Deprecated: use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead; kept for backwards compatibility. The filter condition applied to the source data before aggregation
- Inputs List&lt;string&gt; - Deprecated: use AggregationFunction.inputs instead; kept for backwards compatibility. The input columns from which the feature is computed
- LineageContext FeatureEngineeringFeatureLineageContext - Lineage context information for this feature. WARNING: this field is primarily intended for internal use by Databricks systems and is populated automatically when features are created through Databricks notebooks or jobs. Do not set it manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users
- ProviderConfig FeatureEngineeringFeatureProviderConfig - Configures the provider for management through the account-level provider.
- TimeWindow FeatureEngineeringFeatureTimeWindow - Deprecated: use Function.aggregation_function.time_window instead; kept for backwards compatibility. The time window in which the feature is computed
- TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumn - The column recording time, used for point-in-time joins, backfills, and aggregations
- FullName string - The full three-part name (catalog, schema, name) of the feature
- Function FeatureEngineeringFeatureFunctionArgs - The function by which the feature is computed
- Source FeatureEngineeringFeatureSourceArgs - The data source of the feature
- Description string - The description of the feature
- Entities []FeatureEngineeringFeatureEntityArgs - The entity columns for the feature, used as aggregation keys and for query-time lookup
- FilterCondition string - Deprecated: use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead; kept for backwards compatibility. The filter condition applied to the source data before aggregation
- Inputs []string - Deprecated: use AggregationFunction.inputs instead; kept for backwards compatibility. The input columns from which the feature is computed
- LineageContext FeatureEngineeringFeatureLineageContextArgs - Lineage context information for this feature. WARNING: this field is primarily intended for internal use by Databricks systems and is populated automatically when features are created through Databricks notebooks or jobs. Do not set it manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users
- ProviderConfig FeatureEngineeringFeatureProviderConfigArgs - Configures the provider for management through the account-level provider.
- TimeWindow FeatureEngineeringFeatureTimeWindowArgs - Deprecated: use Function.aggregation_function.time_window instead; kept for backwards compatibility. The time window in which the feature is computed
- TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumnArgs - The column recording time, used for point-in-time joins, backfills, and aggregations
- fullName String - The full three-part name (catalog, schema, name) of the feature
- function FeatureEngineeringFeatureFunction - The function by which the feature is computed
- source FeatureEngineeringFeatureSource - The data source of the feature
- description String - The description of the feature
- entities List&lt;FeatureEngineeringFeatureEntity&gt; - The entity columns for the feature, used as aggregation keys and for query-time lookup
- filterCondition String - Deprecated: use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead; kept for backwards compatibility. The filter condition applied to the source data before aggregation
- inputs List&lt;String&gt; - Deprecated: use AggregationFunction.inputs instead; kept for backwards compatibility. The input columns from which the feature is computed
- lineageContext FeatureEngineeringFeatureLineageContext - Lineage context information for this feature. WARNING: this field is primarily intended for internal use by Databricks systems and is populated automatically when features are created through Databricks notebooks or jobs. Do not set it manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users
- providerConfig FeatureEngineeringFeatureProviderConfig - Configures the provider for management through the account-level provider.
- timeWindow FeatureEngineeringFeatureTimeWindow - Deprecated: use Function.aggregation_function.time_window instead; kept for backwards compatibility. The time window in which the feature is computed
- timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn - The column recording time, used for point-in-time joins, backfills, and aggregations
- fullName string - The full three-part name (catalog, schema, name) of the feature
- function FeatureEngineeringFeatureFunction - The function by which the feature is computed
- source FeatureEngineeringFeatureSource - The data source of the feature
- description string - The description of the feature
- entities FeatureEngineeringFeatureEntity[] - The entity columns for the feature, used as aggregation keys and for query-time lookup
- filterCondition string - Deprecated: use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead; kept for backwards compatibility. The filter condition applied to the source data before aggregation
- inputs string[] - Deprecated: use AggregationFunction.inputs instead; kept for backwards compatibility. The input columns from which the feature is computed
- lineageContext FeatureEngineeringFeatureLineageContext - Lineage context information for this feature. WARNING: this field is primarily intended for internal use by Databricks systems and is populated automatically when features are created through Databricks notebooks or jobs. Do not set it manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users
- providerConfig FeatureEngineeringFeatureProviderConfig - Configures the provider for management through the account-level provider.
- timeWindow FeatureEngineeringFeatureTimeWindow - Deprecated: use Function.aggregation_function.time_window instead; kept for backwards compatibility. The time window in which the feature is computed
- timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn - The column recording time, used for point-in-time joins, backfills, and aggregations
- full_name str - The full three-part name (catalog, schema, name) of the feature
- function FeatureEngineeringFeatureFunctionArgs - The function by which the feature is computed
- source FeatureEngineeringFeatureSourceArgs - The data source of the feature
- description str - The description of the feature
- entities Sequence[FeatureEngineeringFeatureEntityArgs] - The entity columns for the feature, used as aggregation keys and for query-time lookup
- filter_condition str - Deprecated: use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead; kept for backwards compatibility. The filter condition applied to the source data before aggregation
- inputs Sequence[str] - Deprecated: use AggregationFunction.inputs instead; kept for backwards compatibility. The input columns from which the feature is computed
- lineage_context FeatureEngineeringFeatureLineageContextArgs - Lineage context information for this feature. WARNING: this field is primarily intended for internal use by Databricks systems and is populated automatically when features are created through Databricks notebooks or jobs. Do not set it manually; incorrect values may lead to inaccurate lineage tracking or unexpected behavior. It is set by the feature-engineering client and should be left unset by SDK and Terraform users
- provider_config FeatureEngineeringFeatureProviderConfigArgs - Configures the provider for management through the account-level provider.
- time_window FeatureEngineeringFeatureTimeWindowArgs - Deprecated: use Function.aggregation_function.time_window instead; kept for backwards compatibility. The time window in which the feature is computed
- timeseries_column FeatureEngineeringFeatureTimeseriesColumnArgs - The column recording time, used for point-in-time joins, backfills, and aggregations
- full
Name String - The full three-part name (catalog, schema, name) of the feature
- function Property Map
- The function by which the feature is computed
- source Property Map
- The data source of the feature
- description String
- The description of the feature
- entities List<Property Map>
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- filter
Condition String - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- inputs List<String>
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- lineage
Context Property Map - Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- provider
Config Property Map - Configure the provider for management through account provider.
- time
Window Property Map - Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- timeseries
Column Property Map - Column recording time, used for point-in-time joins, backfills, and aggregations
Outputs
All input properties are implicitly available as output properties. Additionally, the FeatureEngineeringFeature resource produces the following output properties:
- Id string
- The provider-assigned unique ID for this managed resource.
- Id string
- The provider-assigned unique ID for this managed resource.
- id String
- The provider-assigned unique ID for this managed resource.
- id string
- The provider-assigned unique ID for this managed resource.
- id str
- The provider-assigned unique ID for this managed resource.
- id String
- The provider-assigned unique ID for this managed resource.
Look up Existing FeatureEngineeringFeature Resource
Get an existing FeatureEngineeringFeature resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: FeatureEngineeringFeatureState, opts?: CustomResourceOptions): FeatureEngineeringFeature
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        description: Optional[str] = None,
        entities: Optional[Sequence[FeatureEngineeringFeatureEntityArgs]] = None,
        filter_condition: Optional[str] = None,
        full_name: Optional[str] = None,
        function: Optional[FeatureEngineeringFeatureFunctionArgs] = None,
        inputs: Optional[Sequence[str]] = None,
        lineage_context: Optional[FeatureEngineeringFeatureLineageContextArgs] = None,
        provider_config: Optional[FeatureEngineeringFeatureProviderConfigArgs] = None,
        source: Optional[FeatureEngineeringFeatureSourceArgs] = None,
        time_window: Optional[FeatureEngineeringFeatureTimeWindowArgs] = None,
        timeseries_column: Optional[FeatureEngineeringFeatureTimeseriesColumnArgs] = None) -> FeatureEngineeringFeature
func GetFeatureEngineeringFeature(ctx *Context, name string, id IDInput, state *FeatureEngineeringFeatureState, opts ...ResourceOption) (*FeatureEngineeringFeature, error)
public static FeatureEngineeringFeature Get(string name, Input<string> id, FeatureEngineeringFeatureState? state, CustomResourceOptions? opts = null)
public static FeatureEngineeringFeature get(String name, Output<String> id, FeatureEngineeringFeatureState state, CustomResourceOptions options)
resources:
  _:
    type: databricks:FeatureEngineeringFeature
    get:
      id: ${id}
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- Description string
- The description of the feature
- Entities List<FeatureEngineeringFeatureEntity>
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- FilterCondition string
- Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- FullName string
- The full three-part name (catalog, schema, name) of the feature
- Function FeatureEngineeringFeatureFunction
- The function by which the feature is computed
- Inputs List<string>
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- LineageContext FeatureEngineeringFeatureLineageContext
- Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- ProviderConfig FeatureEngineeringFeatureProviderConfig
- Configure the provider for management through account provider.
- Source FeatureEngineeringFeatureSource
- The data source of the feature
- TimeWindow FeatureEngineeringFeatureTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
- Column recording time, used for point-in-time joins, backfills, and aggregations
- Description string
- The description of the feature
- Entities []FeatureEngineeringFeatureEntityArgs
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- FilterCondition string
- Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- FullName string
- The full three-part name (catalog, schema, name) of the feature
- Function FeatureEngineeringFeatureFunctionArgs
- The function by which the feature is computed
- Inputs []string
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- LineageContext FeatureEngineeringFeatureLineageContextArgs
- Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- ProviderConfig FeatureEngineeringFeatureProviderConfigArgs
- Configure the provider for management through account provider.
- Source FeatureEngineeringFeatureSourceArgs
- The data source of the feature
- TimeWindow FeatureEngineeringFeatureTimeWindowArgs
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- TimeseriesColumn FeatureEngineeringFeatureTimeseriesColumnArgs
- Column recording time, used for point-in-time joins, backfills, and aggregations
- description String
- The description of the feature
- entities List<FeatureEngineeringFeatureEntity>
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- filterCondition String
- Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- fullName String
- The full three-part name (catalog, schema, name) of the feature
- function FeatureEngineeringFeatureFunction
- The function by which the feature is computed
- inputs List<String>
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- lineageContext FeatureEngineeringFeatureLineageContext
- Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- providerConfig FeatureEngineeringFeatureProviderConfig
- Configure the provider for management through account provider.
- source FeatureEngineeringFeatureSource
- The data source of the feature
- timeWindow FeatureEngineeringFeatureTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
- Column recording time, used for point-in-time joins, backfills, and aggregations
- description string
- The description of the feature
- entities FeatureEngineeringFeatureEntity[]
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- filterCondition string
- Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- fullName string
- The full three-part name (catalog, schema, name) of the feature
- function FeatureEngineeringFeatureFunction
- The function by which the feature is computed
- inputs string[]
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- lineageContext FeatureEngineeringFeatureLineageContext
- Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- providerConfig FeatureEngineeringFeatureProviderConfig
- Configure the provider for management through account provider.
- source FeatureEngineeringFeatureSource
- The data source of the feature
- timeWindow FeatureEngineeringFeatureTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- timeseriesColumn FeatureEngineeringFeatureTimeseriesColumn
- Column recording time, used for point-in-time joins, backfills, and aggregations
- description str
- The description of the feature
- entities Sequence[FeatureEngineeringFeatureEntityArgs]
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- filter_condition str
- Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- full_name str
- The full three-part name (catalog, schema, name) of the feature
- function FeatureEngineeringFeatureFunctionArgs
- The function by which the feature is computed
- inputs Sequence[str]
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- lineage_context FeatureEngineeringFeatureLineageContextArgs
- Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- provider_config FeatureEngineeringFeatureProviderConfigArgs
- Configure the provider for management through account provider.
- source FeatureEngineeringFeatureSourceArgs
- The data source of the feature
- time_window FeatureEngineeringFeatureTimeWindowArgs
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- timeseries_column FeatureEngineeringFeatureTimeseriesColumnArgs
- Column recording time, used for point-in-time joins, backfills, and aggregations
- description String
- The description of the feature
- entities List<Property Map>
- The entity columns for the feature, used as aggregation keys and for query-time lookup
- filterCondition String
- Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- fullName String
- The full three-part name (catalog, schema, name) of the feature
- function Property Map
- The function by which the feature is computed
- inputs List<String>
- Deprecated: Use AggregationFunction.inputs instead. Kept for backwards compatibility. The input columns from which the feature is computed
- lineageContext Property Map
- Lineage context information for this feature. WARNING: This field is primarily intended for internal use by Databricks systems and is automatically populated when features are created through Databricks notebooks or jobs. Users should not manually set this field as incorrect values may lead to inaccurate lineage tracking or unexpected behavior. This field will be set by feature-engineering client and should be left unset by SDK and terraform users
- providerConfig Property Map
- Configure the provider for management through account provider.
- source Property Map
- The data source of the feature
- timeWindow Property Map
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- timeseriesColumn Property Map
- Column recording time, used for point-in-time joins, backfills, and aggregations
Supporting Types
FeatureEngineeringFeatureEntity, FeatureEngineeringFeatureEntityArgs
- Name string
- Name string
- name String
- name string
- name str
- name String
FeatureEngineeringFeatureFunction, FeatureEngineeringFeatureFunctionArgs
- AggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
- An aggregation function applied over a time window
- ExtraParameters List<FeatureEngineeringFeatureFunctionExtraParameter>
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
- FunctionType string
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility.
The type of the function. Possible values are:
APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
- AggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
- An aggregation function applied over a time window
- ExtraParameters []FeatureEngineeringFeatureFunctionExtraParameter
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
- FunctionType string
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility.
The type of the function. Possible values are:
APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
- aggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
- An aggregation function applied over a time window
- extraParameters List<FeatureEngineeringFeatureFunctionExtraParameter>
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
- functionType String
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility.
The type of the function. Possible values are:
APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
- aggregationFunction FeatureEngineeringFeatureFunctionAggregationFunction
- An aggregation function applied over a time window
- extraParameters FeatureEngineeringFeatureFunctionExtraParameter[]
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
- functionType string
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility.
The type of the function. Possible values are:
APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
- aggregation_function FeatureEngineeringFeatureFunctionAggregationFunction
- An aggregation function applied over a time window
- extra_parameters Sequence[FeatureEngineeringFeatureFunctionExtraParameter]
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
- function_type str
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility.
The type of the function. Possible values are:
APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
- aggregationFunction Property Map
- An aggregation function applied over a time window
- extraParameters List<Property Map>
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility. Extra parameters for parameterized functions
- functionType String
- Deprecated: Use the function oneof with AggregationFunction instead. Kept for backwards compatibility.
The type of the function. Possible values are:
APPROX_COUNT_DISTINCT, APPROX_PERCENTILE, AVG, COUNT, FIRST, LAST, MAX, MIN, STDDEV_POP, STDDEV_SAMP, SUM, VAR_POP, VAR_SAMP
FeatureEngineeringFeatureFunctionAggregationFunction, FeatureEngineeringFeatureFunctionAggregationFunctionArgs
- ApproxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
- ApproxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
- Avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
- CountFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
- First FeatureEngineeringFeatureFunctionAggregationFunctionFirst
- Last FeatureEngineeringFeatureFunctionAggregationFunctionLast
- Max FeatureEngineeringFeatureFunctionAggregationFunctionMax
- Min FeatureEngineeringFeatureFunctionAggregationFunctionMin
- StddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
- StddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
- Sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
- TimeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- VarPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
- VarSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
- ApproxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
- ApproxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
- Avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
- CountFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
- First FeatureEngineeringFeatureFunctionAggregationFunctionFirst
- Last FeatureEngineeringFeatureFunctionAggregationFunctionLast
- Max FeatureEngineeringFeatureFunctionAggregationFunctionMax
- Min FeatureEngineeringFeatureFunctionAggregationFunctionMin
- StddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
- StddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
- Sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
- TimeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- VarPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
- VarSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
- approxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
- approxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
- avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
- countFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
- first FeatureEngineeringFeatureFunctionAggregationFunctionFirst
- last FeatureEngineeringFeatureFunctionAggregationFunctionLast
- max FeatureEngineeringFeatureFunctionAggregationFunctionMax
- min FeatureEngineeringFeatureFunctionAggregationFunctionMin
- stddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
- stddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
- sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
- timeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- varPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
- varSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
- approxCountDistinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
- approxPercentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
- avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
- countFunction FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
- first FeatureEngineeringFeatureFunctionAggregationFunctionFirst
- last FeatureEngineeringFeatureFunctionAggregationFunctionLast
- max FeatureEngineeringFeatureFunctionAggregationFunctionMax
- min FeatureEngineeringFeatureFunctionAggregationFunctionMin
- stddevPop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
- stddevSamp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
- sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
- timeWindow FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- varPop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
- varSamp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
- approx_count_distinct FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct
- approx_percentile FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile
- avg FeatureEngineeringFeatureFunctionAggregationFunctionAvg
- count_function FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction
- first FeatureEngineeringFeatureFunctionAggregationFunctionFirst
- last FeatureEngineeringFeatureFunctionAggregationFunctionLast
- max FeatureEngineeringFeatureFunctionAggregationFunctionMax
- min FeatureEngineeringFeatureFunctionAggregationFunctionMin
- stddev_pop FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop
- stddev_samp FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp
- sum FeatureEngineeringFeatureFunctionAggregationFunctionSum
- time_window FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- var_pop FeatureEngineeringFeatureFunctionAggregationFunctionVarPop
- var_samp FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp
- approxCountDistinct Property Map
- approxPercentile Property Map
- avg Property Map
- countFunction Property Map
- first Property Map
- last Property Map
- max Property Map
- min Property Map
- stddevPop Property Map
- stddevSamp Property Map
- sum Property Map
- timeWindow Property Map
- Deprecated: Use Function.aggregation_function.time_window instead. Kept for backwards compatibility. The time window in which the feature is computed
- varPop Property Map
- varSamp Property Map
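The aggregation function is a oneof-style object: in practice exactly one of the functions above is set, optionally together with a time window. A hedged Property Map sketch, where the column name, percentile settings, and duration strings are hypothetical illustrations:

```yaml
# Hypothetical sketch of a `function` value; exactly one aggregation is set.
function:
  aggregationFunction:
    approxPercentile:
      input: latency_ms    # hypothetical input column
      percentile: 0.95     # between 0 and 1
      accuracy: 10000      # higher is more accurate but slower
    timeWindow:
      sliding:
        windowDuration: 1 hour    # duration string format assumed
        slideDuration: 5 minutes  # must be positive and less than the window duration
```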
FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinct, FeatureEngineeringFeatureFunctionAggregationFunctionApproxCountDistinctArgs
- Input string
- RelativeSd double
- The maximum relative standard deviation allowed (default defined by Spark)
- Input string
- RelativeSd float64
- The maximum relative standard deviation allowed (default defined by Spark)
- input String
- relativeSd Double
- The maximum relative standard deviation allowed (default defined by Spark)
- input string
- relativeSd number
- The maximum relative standard deviation allowed (default defined by Spark)
- input str
- relative_sd float
- The maximum relative standard deviation allowed (default defined by Spark)
- input String
- relativeSd Number
- The maximum relative standard deviation allowed (default defined by Spark)
FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentile, FeatureEngineeringFeatureFunctionAggregationFunctionApproxPercentileArgs
- Input string
- Percentile double
- The percentile value to compute (between 0 and 1)
- Accuracy int
- The accuracy parameter (higher is more accurate but slower)
- Input string
- Percentile float64
- The percentile value to compute (between 0 and 1)
- Accuracy int
- The accuracy parameter (higher is more accurate but slower)
- input String
- percentile Double
- The percentile value to compute (between 0 and 1)
- accuracy Integer
- The accuracy parameter (higher is more accurate but slower)
- input string
- percentile number
- The percentile value to compute (between 0 and 1)
- accuracy number
- The accuracy parameter (higher is more accurate but slower)
- input str
- percentile float
- The percentile value to compute (between 0 and 1)
- accuracy int
- The accuracy parameter (higher is more accurate but slower)
- input String
- percentile Number
- The percentile value to compute (between 0 and 1)
- accuracy Number
- The accuracy parameter (higher is more accurate but slower)
FeatureEngineeringFeatureFunctionAggregationFunctionAvg, FeatureEngineeringFeatureFunctionAggregationFunctionAvgArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionCountFunction, FeatureEngineeringFeatureFunctionAggregationFunctionCountFunctionArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionFirst, FeatureEngineeringFeatureFunctionAggregationFunctionFirstArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionLast, FeatureEngineeringFeatureFunctionAggregationFunctionLastArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionMax, FeatureEngineeringFeatureFunctionAggregationFunctionMaxArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionMin, FeatureEngineeringFeatureFunctionAggregationFunctionMinArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionStddevPop, FeatureEngineeringFeatureFunctionAggregationFunctionStddevPopArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionStddevSamp, FeatureEngineeringFeatureFunctionAggregationFunctionStddevSampArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionSum, FeatureEngineeringFeatureFunctionAggregationFunctionSumArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindow, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowArgs
FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuous, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowContinuousArgs
- WindowDuration string
- Offset string
- The offset of the continuous window (must be non-positive)
- WindowDuration string
- Offset string
- The offset of the continuous window (must be non-positive)
- windowDuration String
- offset String
- The offset of the continuous window (must be non-positive)
- windowDuration string
- offset string
- The offset of the continuous window (must be non-positive)
- window_duration str
- offset str
- The offset of the continuous window (must be non-positive)
- windowDuration String
- offset String
- The offset of the continuous window (must be non-positive)
FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSliding, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowSlidingArgs
- SlideDuration string
- The slide duration (interval by which windows advance, must be positive and less than duration)
- WindowDuration string
- SlideDuration string
- The slide duration (interval by which windows advance, must be positive and less than duration)
- WindowDuration string
- slideDuration String
- The slide duration (interval by which windows advance, must be positive and less than duration)
- windowDuration String
- slideDuration string
- The slide duration (interval by which windows advance, must be positive and less than duration)
- windowDuration string
- slide_duration str
- The slide duration (interval by which windows advance, must be positive and less than duration)
- window_duration str
- slideDuration String
- The slide duration (interval by which windows advance, must be positive and less than duration)
- windowDuration String
FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumbling, FeatureEngineeringFeatureFunctionAggregationFunctionTimeWindowTumblingArgs
- WindowDuration string
- WindowDuration string
- windowDuration String
- windowDuration string
- window_duration str
- windowDuration String
FeatureEngineeringFeatureFunctionAggregationFunctionVarPop, FeatureEngineeringFeatureFunctionAggregationFunctionVarPopArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionAggregationFunctionVarSamp, FeatureEngineeringFeatureFunctionAggregationFunctionVarSampArgs
- Input string
- Input string
- input String
- input string
- input str
- input String
FeatureEngineeringFeatureFunctionExtraParameter, FeatureEngineeringFeatureFunctionExtraParameterArgs
FeatureEngineeringFeatureLineageContext, FeatureEngineeringFeatureLineageContextArgs
- JobContext FeatureEngineeringFeatureLineageContextJobContext - Job context information including job ID and run ID
- NotebookId int - The notebook ID where this API was invoked
- JobContext FeatureEngineeringFeatureLineageContextJobContext - Job context information including job ID and run ID
- NotebookId int - The notebook ID where this API was invoked
- jobContext FeatureEngineeringFeatureLineageContextJobContext - Job context information including job ID and run ID
- notebookId Integer - The notebook ID where this API was invoked
- jobContext FeatureEngineeringFeatureLineageContextJobContext - Job context information including job ID and run ID
- notebookId number - The notebook ID where this API was invoked
- job_context FeatureEngineeringFeatureLineageContextJobContext - Job context information including job ID and run ID
- notebook_id int - The notebook ID where this API was invoked
- jobContext Property Map - Job context information including job ID and run ID
- notebookId Number - The notebook ID where this API was invoked
FeatureEngineeringFeatureLineageContextJobContext, FeatureEngineeringFeatureLineageContextJobContextArgs
- job_id int - The job ID where this API was invoked
- job_run_id int - The job run ID where this API was invoked
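A `lineageContext` fragment combining these fields might look like the following Pulumi YAML sketch. All IDs shown are placeholders; in practice you would typically supply either a job context or a notebook ID, depending on where the feature was defined.

```yaml
# Sketch: record where this feature definition originated.
lineageContext:
  jobContext:
    jobId: 123456      # placeholder job ID
    jobRunId: 789012   # placeholder job run ID
  notebookId: 42       # placeholder notebook ID
```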
FeatureEngineeringFeatureProviderConfig, FeatureEngineeringFeatureProviderConfigArgs
- WorkspaceId string - The workspace ID to which the resource belongs. This workspace must be part of the account that the provider is configured with.
- WorkspaceId string - The workspace ID to which the resource belongs. This workspace must be part of the account that the provider is configured with.
- workspaceId String - The workspace ID to which the resource belongs. This workspace must be part of the account that the provider is configured with.
- workspaceId string - The workspace ID to which the resource belongs. This workspace must be part of the account that the provider is configured with.
- workspace_id str - The workspace ID to which the resource belongs. This workspace must be part of the account that the provider is configured with.
- workspaceId String - The workspace ID to which the resource belongs. This workspace must be part of the account that the provider is configured with.
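For an account-level provider, a `providerConfig` fragment pins the resource to one workspace, as in this sketch (the workspace ID is a placeholder):

```yaml
# Sketch: target a specific workspace within the configured account.
providerConfig:
  workspaceId: "1234567890123456"  # placeholder workspace ID
```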
FeatureEngineeringFeatureSource, FeatureEngineeringFeatureSourceArgs
FeatureEngineeringFeatureSourceDeltaTableSource, FeatureEngineeringFeatureSourceDeltaTableSourceArgs
- FullName string - The full three-part name (catalog, schema, name) of the feature
- DataframeSchema string - Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
- EntityColumns List<string> - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
- FilterCondition string - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- TimeseriesColumn string - Column recording time, used for point-in-time joins, backfills, and aggregations
- TransformationSql string - A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
- FullName string - The full three-part name (catalog, schema, name) of the feature
- DataframeSchema string - Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
- EntityColumns []string - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
- FilterCondition string - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- TimeseriesColumn string - Column recording time, used for point-in-time joins, backfills, and aggregations
- TransformationSql string - A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
- fullName String - The full three-part name (catalog, schema, name) of the feature
- dataframeSchema String - Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
- entityColumns List<String> - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
- filterCondition String - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseriesColumn String - Column recording time, used for point-in-time joins, backfills, and aggregations
- transformationSql String - A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
- fullName string - The full three-part name (catalog, schema, name) of the feature
- dataframeSchema string - Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
- entityColumns string[] - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
- filterCondition string - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseriesColumn string - Column recording time, used for point-in-time joins, backfills, and aggregations
- transformationSql string - A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
- full_name str - The full three-part name (catalog, schema, name) of the feature
- dataframe_schema str - Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
- entity_columns Sequence[str] - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
- filter_condition str - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseries_column str - Column recording time, used for point-in-time joins, backfills, and aggregations
- transformation_sql str - A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
- fullName String - The full three-part name (catalog, schema, name) of the feature
- dataframeSchema String - Schema of the resulting dataframe after transformations, in Spark StructType JSON format (from df.schema.json()). Required if transformation_sql is specified. Example: {"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}
- entityColumns List<String> - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity columns of the Delta table
- filterCondition String - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseriesColumn String - Column recording time, used for point-in-time joins, backfills, and aggregations
- transformationSql String - A single SQL SELECT expression applied after filter_condition. Should contain all the columns needed (e.g. "SELECT *, col_a + col_b AS col_c FROM x.y.z WHERE col_a > 0" would have transformation_sql "*, col_a + col_b AS col_c"). If transformation_sql is not provided, all columns of the delta table are present in the DataSource dataframe
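A Delta table source combining these fields might look like the following Pulumi YAML sketch, reusing the `col_a`/`col_c` example from the descriptions above. The table name `main.features.trips` and column `event_ts` are placeholders; `dataframeSchema` is included because `transformationSql` is set.

```yaml
# Sketch: filterCondition is applied first, then the single SELECT
# expression in transformationSql; dataframeSchema describes the result.
source:
  deltaTableSource:
    fullName: main.features.trips   # placeholder catalog.schema.table
    timeseriesColumn: event_ts      # placeholder time column
    filterCondition: col_a > 0
    transformationSql: "*, col_a + col_b AS col_c"
    dataframeSchema: '{"type":"struct","fields":[{"name":"col_a","type":"integer","nullable":true,"metadata":{}},{"name":"col_c","type":"integer","nullable":true,"metadata":{}}]}'
```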
FeatureEngineeringFeatureSourceKafkaSource, FeatureEngineeringFeatureSourceKafkaSourceArgs
- Name string
- EntityColumnIdentifiers List<FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier> - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
- FilterCondition string - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- TimeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier - Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
- Name string
- EntityColumnIdentifiers []FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
- FilterCondition string - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- TimeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier - Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
- name String
- entityColumnIdentifiers List<FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier> - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
- filterCondition String - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier - Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
- name string
- entityColumnIdentifiers FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier[] - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
- filterCondition string - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseriesColumnIdentifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier - Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
- name str
- entity_column_identifiers Sequence[FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier] - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
- filter_condition str - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseries_column_identifier FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier - Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
- name String
- entityColumnIdentifiers List<Property Map> - Deprecated: Use Feature.entity instead. Kept for backwards compatibility. The entity column identifiers of the Kafka source
- filterCondition String - Deprecated: Use DeltaTableSource.filter_condition or KafkaSource.filter_condition instead. Kept for backwards compatibility. The filter condition applied to the source data before aggregation
- timeseriesColumnIdentifier Property Map - Deprecated: Use Feature.timeseries_column instead. Kept for backwards compatibility. The timeseries column identifier of the Kafka source
FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifier, FeatureEngineeringFeatureSourceKafkaSourceEntityColumnIdentifierArgs
- VariantExprPath string - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- VariantExprPath string - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variantExprPath String - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variantExprPath string - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variant_expr_path str - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variantExprPath String - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifier, FeatureEngineeringFeatureSourceKafkaSourceTimeseriesColumnIdentifierArgs
- VariantExprPath string - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- VariantExprPath string - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variantExprPath String - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variantExprPath string - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variant_expr_path str - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
- variantExprPath String - String representation of the column name or variant expression path. For nested fields, the leaf value is what will be present in materialized tables and expected to match at query time. For example, the leaf node of value:trip_details.location_details.pickup_zip is pickup_zip
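A Kafka source using these variant paths might look like the following Pulumi YAML sketch. The source name `trips_stream` and the `value:event_time` path are placeholders; the `value:trip_details.location_details.pickup_zip` path reuses the example from the description above, whose leaf name `pickup_zip` is what appears in materialized tables. Note the identifier fields are deprecated in favor of Feature.entity and Feature.timeseries_column but still illustrate the path format.

```yaml
# Sketch: a Kafka source identifying entity and time columns by
# variant expression paths into the message value.
source:
  kafkaSource:
    name: trips_stream   # placeholder source name
    entityColumnIdentifiers:
      - variantExprPath: value:trip_details.location_details.pickup_zip
    timeseriesColumnIdentifier:
      variantExprPath: value:event_time   # placeholder time path
```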
FeatureEngineeringFeatureTimeWindow, FeatureEngineeringFeatureTimeWindowArgs
FeatureEngineeringFeatureTimeWindowContinuous, FeatureEngineeringFeatureTimeWindowContinuousArgs
- WindowDuration string
- Offset string - The offset of the continuous window (must be non-positive)
- WindowDuration string
- Offset string - The offset of the continuous window (must be non-positive)
- windowDuration String
- offset String - The offset of the continuous window (must be non-positive)
- windowDuration string
- offset string - The offset of the continuous window (must be non-positive)
- window_duration str
- offset str - The offset of the continuous window (must be non-positive)
- windowDuration String
- offset String - The offset of the continuous window (must be non-positive)
FeatureEngineeringFeatureTimeWindowSliding, FeatureEngineeringFeatureTimeWindowSlidingArgs
- SlideDuration string - The slide duration (interval by which windows advance, must be positive and less than duration)
- WindowDuration string
- SlideDuration string - The slide duration (interval by which windows advance, must be positive and less than duration)
- WindowDuration string
- slideDuration String - The slide duration (interval by which windows advance, must be positive and less than duration)
- windowDuration String
- slideDuration string - The slide duration (interval by which windows advance, must be positive and less than duration)
- windowDuration string
- slide_duration str - The slide duration (interval by which windows advance, must be positive and less than duration)
- window_duration str
- slideDuration String - The slide duration (interval by which windows advance, must be positive and less than duration)
- windowDuration String
FeatureEngineeringFeatureTimeWindowTumbling, FeatureEngineeringFeatureTimeWindowTumblingArgs
- WindowDuration string
- WindowDuration string
- windowDuration String
- windowDuration string
- window_duration str
- windowDuration String
FeatureEngineeringFeatureTimeseriesColumn, FeatureEngineeringFeatureTimeseriesColumnArgs
- Name string
- Name string
- name String
- name string
- name str
- name String
Package Details
- Repository
- databricks pulumi/pulumi-databricks
- License
- Apache-2.0
- Notes
- This Pulumi package is based on the databricks Terraform Provider.
