
Conversation

@Yohahaha
Contributor

Purpose

Linked issue: close #xxx

Brief change log

As title.

Tests

Catalog: set/remove table properties in org.apache.fluss.spark.SparkCatalogTest

API and Format

Documentation

@Yohahaha
Contributor Author

@YannByron @wuchong

@Yohahaha
Contributor Author

Flink UT failed with: Error: The action 'Test' has timed out after 60 minutes.

throw e
}
case e: UnsupportedOperationException =>
throw new IllegalArgumentException(e)
Contributor

Just throw the original UnsupportedOperationException.
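In other words, rethrow the caught exception unchanged instead of wrapping it in an IllegalArgumentException. A minimal sketch of the suggested shape (the surrounding method body is elided; only the catch clause matters here):

```scala
try {
  // ... apply the table changes ...
} catch {
  // Propagate the original exception so callers see the real type and message,
  // instead of wrapping it: throw new IllegalArgumentException(e).
  case e: UnsupportedOperationException => throw e
}
```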


override def alterTable(ident: Identifier, changes: TableChange*): Table = {
throw new UnsupportedOperationException("Altering table is not supported")
if (
Contributor

Since toFlussTableChanges already matches on the supported TableChange subtypes, we don't have to check them first here.
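The suggestion can be sketched as follows: alterTable simply delegates to toFlussTableChanges, which pattern-matches the supported TableChange subtypes and rejects the rest, so no up-front filtering is needed. The helper names admin, toTablePath, and loadTable are assumptions for illustration, not the PR's actual code:

```scala
override def alterTable(ident: Identifier, changes: TableChange*): Table = {
  // No pre-check of the changes here: toFlussTableChanges matches the
  // supported TableChange subtypes and throws for anything unsupported.
  val flussChanges = toFlussTableChanges(changes)
  admin.alterTable(toTablePath(ident), flussChanges) // hypothetical client call
  loadTable(ident)
}
```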

checkAnswer(sql("SHOW DATABASES"), Row(DEFAULT_DATABASE) :: Nil)
}

test("Catalog: set/remove table properties") {
Contributor

It might be nice to have a separate SparkTableChangeTest that includes all ALTER TABLE tests.

assert(
flussTable.getCustomProperties.toMap.asScala.getOrElse("key1", "non-exists") == "value1")

sql(s"ALTER TABLE t SET TBLPROPERTIES('key1' = 'value2')")
Contributor

Add cases that set/unset more than one table property.
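A sketch of such a case, following the style of the surrounding test (the table name t and the flussTable helper come from the existing test; the exact assertions are assumptions):

```scala
// Set two properties in one statement, then unset both in one statement.
sql("ALTER TABLE t SET TBLPROPERTIES('key1' = 'v1', 'key2' = 'v2')")
assert(flussTable.getCustomProperties.toMap.asScala.get("key1").contains("v1"))
assert(flussTable.getCustomProperties.toMap.asScala.get("key2").contains("v2"))

sql("ALTER TABLE t UNSET TBLPROPERTIES('key1', 'key2')")
assert(!flussTable.getCustomProperties.toMap.asScala.contains("key1"))
assert(!flussTable.getCustomProperties.toMap.asScala.contains("key2"))
```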

Member

+1 for this

assert(
flussTable.getCustomProperties.toMap.asScala.getOrElse("key1", "non-exists") == "value1")

sql(s"ALTER TABLE t SET TBLPROPERTIES('key1' = 'value2')")
Contributor

There must be some table properties that can't be changed, e.g. table.datalake.format, I guess. Think about this case and update this PR. @wuchong may be able to give some advice.

Member

+1 for adding a test that sets table.* properties. Currently, only table.datalake.format can be altered, and the cluster must be configured with the datalake.format setting.

@YannByron
Contributor

@Yohahaha I left some comments here, and I suggest simplifying this title to [spark] support to set/unset table properties.

Member

@wuchong left a comment

Hi @Yohahaha, I agree with @YannByron's comments. Could you please update the pull request to address them?
