Questions tagged [apache-phoenix]

For questions about Apache Phoenix. For the Elixir web framework, use phoenix-framework.

702 questions
20 votes, 1 answer

Storing data in HBase vs Parquet files

I am new to big data and am trying to understand the various ways of persisting and retrieving data. I understand both Parquet and HBase are column-oriented storage formats, but Parquet is file-oriented storage and not a database, unlike HBase. My…

sovan • 363 • 1 • 4 • 13

18 votes, 4 answers

Cannot read large data from Phoenix table

Hi all, I am getting the below error message while running a Phoenix count query on a large table. 0: jdbc:phoenix:hadoopm1:2181> select Count(*) from PJM_DATASET; +------------+ | COUNT(1) | +------------+ java.lang.RuntimeException:…

user3683741 • 181 • 1 • 5

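A COUNT(*) over a very large table commonly fails on client-side timeouts. Below is a minimal sketch of raising the relevant timeouts through JDBC connection properties; the quorum hadoopm1:2181 and table PJM_DATASET are taken from the question, the timeout values are illustrative assumptions, and the same keys can instead go into the client-side hbase-site.xml.

```scala
import java.sql.DriverManager
import java.util.Properties

object LargeCountQuery {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    // Client-side overrides (illustrative values); the client's hbase-site.xml
    // is the more common place for these settings.
    props.setProperty("phoenix.query.timeoutMs", "1800000")
    props.setProperty("hbase.rpc.timeout", "1800000")
    props.setProperty("hbase.client.scanner.timeout.period", "1800000")

    // hadoopm1:2181 is the ZooKeeper quorum shown in the question above.
    val conn = DriverManager.getConnection("jdbc:phoenix:hadoopm1:2181", props)
    try {
      val stmt = conn.createStatement()
      stmt.setQueryTimeout(1800) // standard JDBC query timeout, in seconds
      val rs = stmt.executeQuery("SELECT COUNT(*) FROM PJM_DATASET")
      while (rs.next()) println(s"row count = ${rs.getLong(1)}")
    } finally {
      conn.close()
    }
  }
}
```
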
14 votes, 2 answers

JsonParseException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')

I have a program that inserts a new patient into HBase in a Docker container inside a server. Everything is working fine until I try to change the connection IP to a Phoenix Query Server for running JUnit tests. I am setting the URL in the properties…

randombee • 699 • 1 • 5 • 26

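A leading '<' in that JsonParseException usually means the client received an HTML page instead of the Query Server's expected response, e.g. the URL points at the wrong host or port, or the serialization setting does not match the server. A minimal sketch of a thin-client connection, assuming a hypothetical host pqs-host and the default Query Server port 8765:

```scala
import java.sql.DriverManager

object ThinClientCheck {
  def main(args: Array[String]): Unit = {
    // Thin (Query Server) driver, shipped in the phoenix-queryserver-client jar.
    Class.forName("org.apache.phoenix.queryserver.client.Driver")

    // serialization must match phoenix.queryserver.serialization on the server
    // (PROTOBUF by default in recent releases); pqs-host is a placeholder.
    val url = "jdbc:phoenix:thin:url=http://pqs-host:8765;serialization=PROTOBUF"

    val conn = DriverManager.getConnection(url)
    try {
      val rs = conn.createStatement().executeQuery("SELECT 1 FROM SYSTEM.CATALOG LIMIT 1")
      while (rs.next()) println("query server reachable: " + rs.getInt(1))
    } finally {
      conn.close()
    }
  }
}
```
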
14 votes, 3 answers

How to export data to a text file in Apache Phoenix?

I'm quite new to HBase and Phoenix. Is there a way I can dump/export data to a text file? It would be much appreciated if I could specify the field terminator, such as ',' or '|>'. Thanks.

dehiker • 454 • 1 • 8 • 21

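Interactively, sqlline can do this with !outputformat csv plus !record <file>, but then the field terminator is fixed. A minimal JDBC sketch that streams a query result into a file with an arbitrary delimiter, assuming a hypothetical table MY_TABLE, ZooKeeper quorum zk-host:2181 and output path:

```scala
import java.io.PrintWriter
import java.sql.DriverManager

object PhoenixExportToText {
  def main(args: Array[String]): Unit = {
    val delimiter = "|>" // the custom field terminator
    val out = new PrintWriter("/tmp/my_table.txt")
    val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
    try {
      val rs = conn.createStatement().executeQuery("SELECT * FROM MY_TABLE")
      val cols = rs.getMetaData.getColumnCount
      while (rs.next()) {
        // NULLs are written as empty fields.
        val row = (1 to cols).map(i => Option(rs.getString(i)).getOrElse(""))
        out.println(row.mkString(delimiter))
      }
    } finally {
      out.close()
      conn.close()
    }
  }
}
```
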
14 votes, 2 answers

Using Phoenix with Cloudera HBase (installed from repo)

I can get Phoenix working on a standalone Apache HBase (note, all this is for HBase 1.0.0 on RHEL 6.5). For the Cloudera flavour of HBase, however, I never get it working without it throwing exceptions (I even tried RHEL7 minimal as an OS). The same thing…

Havnar • 2,558 • 7 • 33 • 62

14 votes, 2 answers

HBase scans are slow

Problem: I am trying to build a secondary index with Phoenix. Index creation takes several hours. It seems to be due to slow HBase scans, as I noticed the following performance: I might need 2 hours to scan the table, whereas other developers…

Martin Pernollet • 2,285 • 1 • 28 • 39

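For a table that takes hours to scan, one common approach is to declare the index ASYNC, so the DDL returns immediately, and then populate it with Phoenix's MapReduce IndexTool rather than a single long online build. A minimal sketch with placeholder table, column and quorum names:

```scala
import java.sql.DriverManager

object AsyncIndexBuild {
  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
    try {
      // ASYNC leaves the index in a building state instead of scanning the data
      // table synchronously; MY_TABLE, COL1 and COL2 are placeholders.
      conn.createStatement().execute(
        "CREATE INDEX IF NOT EXISTS MY_TABLE_IDX ON MY_TABLE (COL1) INCLUDE (COL2) ASYNC")
    } finally {
      conn.close()
    }
    // The index is then populated offline with the MapReduce job, e.g.:
    //   hbase org.apache.phoenix.mapreduce.index.IndexTool \
    //     --data-table MY_TABLE --index-table MY_TABLE_IDX --output-path /tmp/idx
  }
}
```
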
10 votes, 1 answer

Write DataFrame to Phoenix

I am trying to write a DataFrame to a Phoenix table but I am getting an exception. Here is my code: df.write.format("org.apache.phoenix.spark").mode(SaveMode.Overwrite).options(collection.immutable.Map( "zkUrl" ->…

ROOT • 1,757 • 4 • 34 • 60

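For reference, a complete version of that write path with the phoenix-spark connector looks roughly like the sketch below. OUTPUT_TABLE and zk-host are placeholders; the Phoenix table must already exist, its column names must match the DataFrame's, the connector expects SaveMode.Overwrite (rows are upserted), and the phoenix-spark and Phoenix client jars must be on the Spark classpath.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object WriteDataFrameToPhoenix {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("write-to-phoenix").getOrCreate()
    import spark.implicits._

    // Column names are upper-cased to match Phoenix's default identifier handling.
    val df = Seq((1L, "foo"), (2L, "bar")).toDF("ID", "NAME")

    df.write
      .format("org.apache.phoenix.spark")
      .mode(SaveMode.Overwrite) // rows are upserted into the existing table
      .options(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "zk-host:2181"))
      .save()

    spark.stop()
  }
}
```
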
10 votes, 4 answers

How to export table schemas in Apache Phoenix?

I'd like to export the schema of an existing table in Apache Phoenix. Are there some commands or tools to do the same thing as SHOW CREATE TABLE TABLE_NAME in MySQL? Thanks.

Weibo Li • 3,565 • 3 • 24 • 36

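Phoenix has no SHOW CREATE TABLE, but column definitions can be read back through standard JDBC metadata (they ultimately come from SYSTEM.CATALOG). A minimal sketch assuming a hypothetical table MY_TABLE and quorum zk-host:2181; reconstructing the full DDL (primary key, salt buckets, table properties) would need more of the catalog columns:

```scala
import java.sql.DriverManager

object DescribePhoenixTable {
  def main(args: Array[String]): Unit = {
    val conn = DriverManager.getConnection("jdbc:phoenix:zk-host:2181")
    try {
      // Standard JDBC metadata call; a null schema pattern means the default schema.
      val rs = conn.getMetaData.getColumns(null, null, "MY_TABLE", null)
      while (rs.next()) {
        println(rs.getString("COLUMN_NAME") + " " + rs.getString("TYPE_NAME"))
      }
    } finally {
      conn.close()
    }
  }
}
```
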
9 votes, 4 answers

Phoenix sqlline cannot display all columns of a table on the terminal

I use Phoenix sqlline to connect to HBase. On a SecureCRT terminal I can only see three columns of a table which has more than 10 columns. I would like to display all columns of the table to test whether the data is OK. Is there any configuration that should be…

Carl H • 405 • 1 • 8 • 20

8 votes, 1 answer

java.lang.IllegalArgumentException: Unable to PTableType enum for value of 'MATERIALIZED VIEW' exception with Phoenix and HBase

I am very new to Saiku. I am trying to integrate Saiku with Phoenix. Phoenix in turn connects to HBase. I created a schema, and when Saiku tries to load the Phoenix schema XML, I get the error below. I am working restlessly to figure…

venky • 400 • 1 • 6 • 19

8 votes, 4 answers

Big data with very fast access

I am facing a problem: a database for process plants. There are up to 50,000 sensors at a sampling rate of 50 ms. All measured values need to be stored for at least 3 years and must support real-time queries (i.e. users can see historical data with delay…

duong_dajgja • 4,196 • 1 • 38 • 65

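The figures in the question translate into a sustained ingest rate of about one million samples per second and on the order of 10^14 samples over three years. A back-of-envelope sketch of that arithmetic; the bytes-per-sample figure is an assumption for illustration only:

```scala
object IngestEstimate {
  def main(args: Array[String]): Unit = {
    val sensors = 50000L
    val samplePeriodMs = 50L
    val years = 3L
    val bytesPerSample = 16L // assumed: timestamp + sensor id + value, pre-compression

    val samplesPerSecond = sensors * (1000L / samplePeriodMs) // 1,000,000 writes/s
    val seconds = years * 365L * 24L * 3600L                  // ~9.46e7 s
    val totalSamples = samplesPerSecond * seconds             // ~9.46e13 samples
    val rawBytes = totalSamples * bytesPerSample              // ~1.5 PB before compression

    println(f"ingest rate: $samplesPerSecond%,d samples/s")
    println(f"samples over $years years: $totalSamples%,d")
    println(f"raw size: ${rawBytes / 1e15}%.2f PB (at $bytesPerSample bytes/sample)")
  }
}
```
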
8 votes, 1 answer

Google Cloud Bigtable coprocessor support

Google Cloud BigTable doesn't support coprocessors: Coprocessors are not supported. You cannot create classes that implement the interface org.apache.hadoop.hbase.coprocessor. https://cloud.google.com/bigtable/docs/hbase-differences I can…

Sergei Rodionov • 4,079 • 6 • 27 • 44

8 votes, 2 answers

Apache Phoenix vs Hive-Spark

What's faster/easier to convert into SQL and accepts SQL scripts as input: Spark SQL, which comes as a speed layer for Hive's high-latency queries, or Phoenix? And if so, how? I need to do a lot of upserts/joining/grouping over the data. [hbase] Is…

kraster • 297 • 1 • 5 • 13

7 votes, 1 answer

HBase cluster: can't connect to HBase via Phoenix client

I am trying to connect to an HBase cluster via Phoenix. First, I copied the Phoenix client and query server jar files to the HMaster and HRegionServer lib folders and restarted the HBase services. Server: started the Phoenix Query Server via /bin/queryserver.py. Its…

BASS KARAN • 181 • 6

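When the standard (thick) client is used instead of the Query Server, the JDBC URL names the ZooKeeper quorum directly. A minimal connectivity check, assuming placeholder quorum hosts and the default /hbase znode; some distributions use /hbase-unsecure, and the value must match zookeeper.znode.parent in hbase-site.xml:

```scala
import java.sql.DriverManager

object ThickClientCheck {
  def main(args: Array[String]): Unit = {
    // Thick-client URL format: jdbc:phoenix:<zk quorum>:<zk port>:<hbase znode>.
    // zk1,zk2,zk3 and /hbase are placeholders for this sketch.
    val url = "jdbc:phoenix:zk1,zk2,zk3:2181:/hbase"

    val conn = DriverManager.getConnection(url)
    try {
      val rs = conn.createStatement().executeQuery("SELECT 1 FROM SYSTEM.CATALOG LIMIT 1")
      while (rs.next()) println("connected: " + rs.getInt(1))
    } finally {
      conn.close()
    }
  }
}
```
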
6 votes, 0 answers

How to save a DataFrame into HBase?

I have a DataFrame with a schema, and I have also created a table in HBase with Phoenix. What I want is to save this DataFrame to HBase using Spark. I have tried the descriptions in the following link and run the spark-shell with the Phoenix plugin dependencies. spark-shell…

Saygın Doğu • 305 • 1 • 4 • 17