
I have a job that takes around one to two minutes to finish. When I try to run the same job from the command line, it just runs forever and never finishes, and I don't get any errors from it either. The job does seem to start, and I know the job itself is correct because it runs fine in Spoon. Any ideas?

C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration> Kitchen.bat /file:C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration\job.kjb /level:Minimal

DEBUG: Using PENTAHO_JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jre1.8.0_231
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jre1.8.0_231\bin\java.exe

C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration>"C:\Program Files\Java\jre1.8.0_231\bin\java.exe" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\launcher.jar -lib ..\libswt\win64 -main org.pentaho.di.kitchen.Kitchen -initialDir "C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration"\ /file:C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration\job.kjb /level:Minimal

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0

13:58:07,867 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled

13:58:12,006 INFO [KarafInstance]


* Karaf Instance Number: 2 at C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration.\system\karaf\caches\kitchen\data-1 *
* FastBin Provider Port: 52902 *
* Karaf Port: 8803 *
* OSGI Service Port: 9052 *
*******************************************************************************

Dec 19, 2019 1:58:12 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFO: Lock acquired. Setting startlevel to 100
2019/12/19 13:58:12 - Kitchen - Logging is at level : Minimal
2019/12/19 13:58:12 - Kitchen - Start of run.
2019-12-19 13:58:15.902:INFO:oejs.Server:jetty-8.1.15.v20140411
2019-12-19 13:58:15.955:INFO:oejs.AbstractConnector:Started NIOSocketConnectorWrapper@0.0.0.0:9052
Dec 19, 2019 1:58:16 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions
INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-management (182) [org.apache.cxf.management.InstrumentationManager]
Dec 19, 2019 1:58:16 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions
INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-transports-http (183) [org.apache.cxf.transport.http.HTTPTransportFactory, org.apache.cxf.transport.http.HTTPWSDLExtensionLoader, org.apache.cxf.transport.http.policy.HTTPClientAssertionBuilder, org.apache.cxf.transport.http.policy.HTTPServerAssertionBuilder, org.apache.cxf.transport.http.policy.NoOpPolicyInterceptorProvider]
Dec 19, 2019 1:58:16 PM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
INFO: New Caching Service registered
2019/12/19 13:58:17 - job - Start of job execution
Dec 19, 2019 1:58:18 PM org.apache.cxf.endpoint.ServerImpl initDestination
INFO: Setting the server's publish address to be /lineage
Dec 19, 2019 1:58:18 PM org.apache.cxf.endpoint.ServerImpl initDestination
INFO: Setting the server's publish address to be /i18n
Dec 19, 2019 1:58:19 PM org.apache.cxf.endpoint.ServerImpl initDestination
INFO: Setting the server's publish address to be /marketplace

Update: I tried deleting the kitchen cache from the Karaf caches directory; the job started running but still never finished. I'm now running the job at Debug level and getting the results below. The job still gets no further than this. It works in Spoon, so the problem cannot be in the job itself.
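For reference, the cache deletion was roughly the following (a sketch in Unix shell syntax; `PDI_HOME` is an assumed variable standing in for the data-integration folder, and on this Windows setup the equivalent is deleting the same folder under data-integration\system\karaf\caches):

```shell
# Sketch: clear the Kitchen Karaf cache so it is rebuilt on the next run.
# PDI_HOME is an assumption; point it at your data-integration directory.
PDI_HOME="${PDI_HOME:-$HOME/data-integration}"
CACHE_DIR="$PDI_HOME/system/karaf/caches/kitchen"
rm -rf "$CACHE_DIR"   # Kitchen recreates this directory on its next start
echo "Removed $CACHE_DIR"
```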

C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration>kitchen.bat /file:C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration\Job.kjb /level:Debug

DEBUG: Using PENTAHO_JAVA_HOME

DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jre1.8.0_231

DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jre1.8.0_231\bin\java.exe

C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration>"C:\Program Files\Java\jre1.8.0_231\bin\java.exe" "-Xms1024m" "-Xmx2048m" "-XX:MaxPermSize=256m" "-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2" "-Djava.library.path=libswt\win64" "-DKETTLE_HOME=" "-DKETTLE_REPOSITORY=" "-DKETTLE_USER=" "-DKETTLE_PASSWORD=" "-DKETTLE_PLUGIN_PACKAGES=" "-DKETTLE_LOG_SIZE_LIMIT=" "-DKETTLE_JNDI_ROOT=" -jar launcher\launcher.jar -lib ..\libswt\win64 -main org.pentaho.di.kitchen.Kitchen -initialDir "C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration"\ /file:C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration\Job.kjb /level:Debug

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0

08:07:33,026 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled

08:07:37,211 INFO [KarafInstance]


* Karaf Instance Number: 1 at C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration.\system\karaf\caches\kitchen\data-1 *
* FastBin Provider Port: 52901 *
* Karaf Port: 8802 *
* OSGI Service Port: 9051 *


Dec 23, 2019 8:07:38 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired

INFO: Lock acquired. Setting startlevel to 100

2019/12/23 08:07:38 - Kitchen - Logging is at level : Debug

2019/12/23 08:07:38 - Kitchen - Start of run.

2019/12/23 08:07:38 - Kitchen - Allocate new job.

2019/12/23 08:07:38 - Kitchen - Parsing command line options.

2019-12-23 08:07:43.475:INFO:oejs.Server:jetty-8.1.15.v20140411

2019-12-23 08:07:43.538:INFO:oejs.AbstractConnector:Started NIOSocketConnectorWrapper@0.0.0.0:9051

Dec 23, 2019 8:07:43 AM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions

INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-management (182) [org.apache.cxf.management.InstrumentationManager]

Dec 23, 2019 8:07:43 AM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions

INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-transports-http (183) [org.apache.cxf.transport.http.HTTPTransportFactory, org.apache.cxf.transport.http.HTTPWSDLExtensionLoader, org.apache.cxf.transport.http.policy.HTTPClientAssertionBuilder, org.apache.cxf.transport.http.policy.HTTPServerAssertionBuilder, org.apache.cxf.transport.http.policy.NoOpPolicyInterceptorProvider]

Dec 23, 2019 8:07:44 AM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess

INFO: New Caching Service registered

2019/12/23 08:07:45 - Job - Start of job execution

2019/12/23 08:07:45 - Job - exec(0, 0, START.0)

2019/12/23 08:07:45 - START - Starting job entry

2019/12/23 08:07:45 - Job - Job

Dec 23, 2019 8:07:46 AM org.apache.cxf.endpoint.ServerImpl initDestination

INFO: Setting the server's publish address to be /lineage

Dec 23, 2019 8:07:47 AM org.apache.cxf.endpoint.ServerImpl initDestination

INFO: Setting the server's publish address to be /i18n

Dec 23, 2019 8:07:48 AM org.apache.cxf.endpoint.ServerImpl initDestination

INFO: Setting the server's publish address to be /marketplace

2019/12/23 08:07:55 - Job - Triggering heartbeat signal for Job at every 10 seconds

OCTAVIAN
  • Can you try deleting the `C:\Users\a\Downloads\pdi-ce-8.3.0.0-371\data-integration.\system\karaf\caches\kitchen` directory and then rerunning kitchen? I'm curious whether your Karaf is just corrupted for whatever reason. Kitchen will rebuild those caches the next time you run it. – eicherjc Dec 19 '19 at 21:40
  • @eicherjc Doing that got a bit further: the log reached the "purge stale after 1440 minutes" line, but the job still never completed. The job takes 1 minute in Pentaho; I waited 20 minutes before terminating it. I also re-ran it after that, and it looks like it's back to the initial issue again. – OCTAVIAN Dec 20 '19 at 08:13
  • I just tried running a job through kitchen using the same version of Spoon you're using and didn't have any issues. No hangs. If you run the job at log level Debug instead of Minimal, does that give any more useful info at all? If not, can you tell me some of the steps you're using in the job? – eicherjc Dec 20 '19 at 21:40
  • Just updated the post with the results from your steps, @eicherjc – OCTAVIAN Dec 23 '19 at 08:17
  • What does the job do? (What are the /lineage, /i18n and /marketplace endpoints?) If the job cannot finish, I assume it at least starts? Can you check any output (e.g. the number of rows in a database growing, being updated, or shrinking)? Is it just starting slowly? Is your CPU going crazy when you run kitchen? Is kitchen doing something, or waiting on something? (I would also use jvisualvm / jtop / jheap etc. to check what is going on inside the JVM.) – wargre Dec 28 '19 at 18:59
  • You can also try deleting all the Pentaho-related folders from your user profile location, re-launching PDI, and running the job. – Helping Hand.. Jan 09 '20 at 06:49
  • @HelpingHand.. the job works in Pentaho (Spoon), though. I've tried deleting the Spoon cache and re-running it through Spoon, which was fine, but it didn't change anything on the command-line side. – OCTAVIAN Jan 09 '20 at 13:40
  • @HelpingHand.. PDI is stored in my Downloads folder; which folders should I be looking at? – OCTAVIAN Jan 09 '20 at 13:41
  • I am talking about the folders inside C:\Users\yourusername\ : in that location you will find the .kettle and .pentaho folders; remove them. Close the PDI client before removing them. When you re-launch, these folders will be created automatically again. – Helping Hand.. Jan 10 '20 at 05:44
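The folder reset described in the last comment can be sketched as follows (Unix shell syntax; `USER_HOME` is an assumption standing in for C:\Users\yourusername, and .kettle / .pentaho are the standard per-user PDI config folders):

```shell
# Sketch: remove the per-user PDI config folders; PDI recreates them
# automatically on the next launch. Close the PDI client first.
# USER_HOME is an assumption standing in for C:\Users\yourusername.
USER_HOME="${USER_HOME:-$HOME}"
rm -rf "$USER_HOME/.kettle" "$USER_HOME/.pentaho"
```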

2 Answers


Something deeper must have been corrupted: I deleted all the files, downloaded the latest version, and it worked.

OCTAVIAN

To run from the command line, invoke kitchen with the job file and a log level, redirecting the output to a log file:

path to kitchen.sh/kitchen.sh -file="job.kjb" -level=Debug >> "log.txt"

(Note: Kitchen runs .kjb job files; .ktr transformation files are run with pan.sh instead.)
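A slightly fuller sketch of this, wrapping the call in a script that also checks Kitchen's exit status (`KITCHEN` and `JOB` are assumed paths; Kitchen conventionally exits 0 on success and non-zero on error):

```shell
#!/bin/sh
# Sketch: run a PDI job via Kitchen, appending the log to a file and
# reporting the exit code. KITCHEN and JOB are assumed paths; adjust
# them for your installation.
KITCHEN="${KITCHEN:-$HOME/data-integration/kitchen.sh}"
JOB="${JOB:-$HOME/jobs/job.kjb}"
LOG="kitchen.log"

"$KITCHEN" -file="$JOB" -level=Debug >> "$LOG" 2>&1
STATUS=$?   # 0 means the job finished without errors

if [ "$STATUS" -ne 0 ]; then
    echo "Kitchen failed with exit code $STATUS; see $LOG"
else
    echo "Job finished successfully; log appended to $LOG"
fi
```

Redirecting both stdout and stderr (`2>&1`) matters here: if Kitchen hangs or fails before its own logging starts, the shell-level error still ends up in the log file.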

Jay Kakadiya