Is https://helm.gethue.com up-to-date?

Hi,

Is the Hue repo up-to-date on helm.gethue.com?

When we install Hue from Helm repo on Kubernetes, we don’t see the same UI and features as shown on https://demo.gethue.com/

I’m using -

helm repo add gethue https://helm.gethue.com
helm repo update
helm install gethue/hue --generate-name
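One thing worth checking is which chart version the repo actually serves versus what gets installed by default. A sketch using standard Helm commands (the version number shown is a placeholder, not a real published version):

```shell
# List every chart version the repo currently serves
helm search repo gethue/hue --versions

# Install a specific chart version instead of whatever is resolved as latest
# (1.0.0 is a placeholder -- pick a version from the listing above)
helm install gethue/hue --generate-name --version 1.0.0
```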

How can I get the most recent Hue so it matches the Hue on demo.gethue.com?

Any help would be really appreciated!

Thanks,
Amit

Those should be the same as using the gethue/hue:latest Docker image. Do you have a screenshot?

@Romain Thanks for reaching out.

This is what I see with the Helm installation on Kubernetes, as well as with the gethue/hue:latest Docker image and on hue.com.

Also, I realized that on localhost, when I run any job (a SELECT query), it never returns any results on the Hue results screen.

Any idea what could be causing it?

What can I do to get the new UI and updates for Hue, like on demo.gethue.com?

Thanks,

Ha, this one is because the Editor and Connectors betas are turned on for demo.gethue.com. Check the “Beta” section at the top of https://docs.gethue.com/administrator/configuration/connectors/
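For reference, on recent Hue versions the Connectors beta is toggled with a flag in hue.ini (a sketch based on the documentation page linked above; double-check the exact section name and any companion Editor flag there):

```
[desktop]
# Opt in to the beta Connectors interface, per the "Beta" section
# of the connectors documentation.
enable_connectors=true
```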

@Romain

I’m using this example https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/functions/udfs.html#table-functions
which uses the following output type for the TableFunction:

@FunctionHint(output = @DataTypeHint("ROW<word STRING, length INT>"))

When I use a TableFunction with the @FunctionHint annotation, it works fine in the Flink SQL CLI,
but when I run the same query in Hue via the Ververica SQL Gateway, it doesn’t seem to support @FunctionHint and/or the ROW<word STRING, length INT> output type.

Do you know if the Ververica SQL Gateway (version flink-1.11.1) supports @FunctionHint for TableFunction with Flink 1.11.3 or Flink 1.12?

Is there a way to make it work, so it can call a TableFunction that uses @FunctionHint for its output type?

Thanks,
Amit

What error are you getting?

The SQL Gateway should support it, but you might need to register it in the Gateway config or at the SQL level.
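In case registration is the missing piece: the Ververica gateway's environment file follows the Flink SQL Client YAML format, so a UDF can be declared there. Treat the exact keys as an assumption, and note that com.example.SplitFunction is a placeholder class name, not one from this thread:

```
# sql-gateway-defaults.yaml -- assumed to follow the SQL Client
# environment-file format for function declarations
functions:
  - name: splitfunction
    from: class
    class: com.example.SplitFunction   # placeholder package/class
```

Alternatively, at the SQL level, Flink 1.11+ supports `CREATE FUNCTION splitfunction AS 'com.example.SplitFunction';` (same placeholder caveat).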

@Romain

Here are the details-

SQL Gateway config:

#===================================================================
# Gateway server properties
#===================================================================
server:
  # The address that the gateway binds itself to.
  # bind-address: 127.0.0.1
  # The address that should be used by clients to connect to the gateway.
  address: flink-jobmanager
  # The port that the client connects to.
  port: 8083
  # The JVM args for the SQL Gateway process,
  # e.g. -Xmx2018m -Xms1024m -XX:+UseConcMarkSweepGC -XX:+PrintGCDetails -XX:+PrintGCDateStamps ...
  jvm_args: "-Xmx2018m -Xms1024m"

UDF TableFunction

import org.apache.flink.table.annotation.DataTypeHint;
import org.apache.flink.table.annotation.FunctionHint;
import org.apache.flink.table.api.*;
import org.apache.flink.table.functions.TableFunction;
import org.apache.flink.types.Row;

@FunctionHint(output = @DataTypeHint("ROW<word STRING, length INT>"))
public class SplitFunction extends TableFunction<Row> {

    public void eval(String str) {
        for (String s : str.split(" ")) {
            // use collect(...) to emit a row
            collect(Row.of(s, s.length()));
        }
    }
}
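For what it's worth, stripped of the Flink API, eval just emits one (word, length) pair per space-separated token, so the hinted row type genuinely has two columns. A plain-Java sketch of that logic (no Flink dependency; SplitSketch is an illustrative name, not a class from this thread):

```java
import java.util.ArrayList;
import java.util.List;

public class SplitSketch {
    // Mirrors SplitFunction.eval: one (word, length) pair per token.
    static List<Object[]> split(String str) {
        List<Object[]> rows = new ArrayList<>();
        for (String s : str.split(" ")) {
            rows.add(new Object[] { s, s.length() });
        }
        return rows;
    }

    public static void main(String[] args) {
        for (Object[] row : split("hello world")) {
            System.out.println(row[0] + "\t" + row[1]);
        }
    }
}
```

If the gateway does not see the @FunctionHint, it falls back to a single raw output column (the 'f0' in the error below), which is exactly why the two-column alias list is rejected.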

This is the query I run in Hue:

SELECT * FROM LATERAL TABLE(splitfunction('hello world')) AS T(word, length);

The same UDF works fine in the Flink SQL CLI and returns this:

[screenshot: query results in the Flink SQL CLI]

But when I run it in Hue, I see this error instead:

500 Server Error: Internal Server Error for url: http://flink-jobmanager:8083/v1/sessions/b30ada21db368b06bb5f8f63893aff0d/statements

{"errors":["Internal server error.","<Exception on server side:
com.ververica.flink.table.gateway.utils.SqlExecutionException: Invalid SQL statement.
    at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:918)
    at org.apache.flink.shaded.netty4.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    at java.base/java.lang.Thread.run(Unknown Source)
Caused by: org.apache.flink.table.api.ValidationException: SQL validation failed. From line 1, column 64 to line 1, column 75: List of column aliases must have same degree as table; table has 1 columns ('f0'), whereas alias list has 2 columns
    at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:146)
    at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$validate(FlinkPlannerImpl.scala:141)
    ... 54 more
Caused by: org.apache.calcite.sql.validate.SqlValidatorException: List of column aliases must have same degree as table; table has 1 columns ('f0'), whereas alias list has 2 columns
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.base/java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:457)
    at org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:550)
    ... 69 more
End of exception on server side>"]}

Please let me know what could be wrong.

You might need to include the jar in the Gateway service if putting it in its Flink config is not enough.

I am not the expert there, maybe ask on https://github.com/ververica/flink-sql-gateway ?