Java Remote Debugging

Source: Internet · Editor: 程序博客网 · Time: 2024/05/22 04:34
【Debugging a class file】
Source of Main.java:

public class Main {
    public static void main(String[] args) throws InterruptedException {
        while (true) {
            System.out.println("my name is debug");
            Thread.sleep(2);
            System.out.println("my name is linux");
        }
    }
}

Start the JVM on Linux in debug mode:

java -Xdebug -Xrunjdwp:transport=dt_socket,address=5000,server=y,suspend=y Main
Listening for transport dt_socket at address: 5000

Main.class must be compiled in advance and uploaded to the Linux server.

In IDEA, go to Run -> Edit Configurations and create a new Remote configuration. Fill in Host and Port under the Settings tab on the right, set breakpoints in Main.java, and click Debug to start debugging.
Parameter explanation: address=5000 means the debug JVM listens on port 5000; server=y/n controls whether this JVM acts as the debug server (y) or connects out to a debugger (n); suspend=y/n controls whether the JVM suspends at startup and waits until the IDEA client attaches to port 5000 — with suspend=y, Main.class only starts executing after the debugger connects.
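On JDK 5 and later the same options are usually written with the -agentlib:jdwp form; the older -Xdebug -Xrunjdwp pair still works but is legacy syntax. A minimal sketch, assuming the same port 5000:

```shell
# Equivalent launch using the -agentlib:jdwp syntax (JDK 5+).
# suspend=y: the JVM blocks at startup until a debugger attaches on port 5000.
JDWP="-agentlib:jdwp=transport=dt_socket,address=5000,server=y,suspend=y"
# Full command (printed rather than run here, since with suspend=y it would
# block waiting for a debugger to attach):
echo "java $JDWP Main"
```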
【Debugging a jar】
The source of Main is the same; start the JVM with remote debugging enabled:
java -Xdebug -Xrunjdwp:transport=dt_socket,address=5000,server=y,suspend=y -cp remoteDebug-1.0-SNAPSHOT-jar-with-dependencies.jar Main
Listening for transport dt_socket at address: 5000
-cp specifies the path to the jar, and Main is the entry class inside it; if the class lives in a package, the fully qualified name is required, e.g. com.xxxx.Main.
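As a side note, if the jar's manifest declares a Main-Class, the entry class need not be named on the command line: `java -agentlib:jdwp=... -jar app.jar` works instead of `-cp <jar> Main`. A minimal sketch, with a hypothetical manifest entry:

```shell
# Hypothetical manifest fragment. With it, 'java -jar app.jar' finds the
# entry class itself. Note that with -jar the manifest's Class-Path is used
# and any -cp option on the command line is ignored.
cat > MANIFEST.MF <<'EOF'
Main-Class: com.xxxx.Main
EOF
grep '^Main-Class:' MANIFEST.MF
```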
The rest of the steps are the same as above.


【Remote debugging a Spark Streaming job】

Use mvn to package a jar with dependencies, e.g. bsaata_merge-1.0-jar-with-dependencies.jar.

The normal way to submit this spark-streaming job is then:

spark-1.3.0-bin-hadoop2.4/bin/spark-submit --master yarn-client --num-executors 3 --executor-memory 4g --driver-memory 2g --jars $SPARK_HOME/lib/postgresql-9.4-1201.jdbc41.jar --class com.nsfocus.bsaata.merge.main.App /home/bsauser/BSA/apps/bsa_ata/bin/bsaata_merge-1.0-jar-with-dependencies.jar

(Note: spark-submit options such as --jars must come before the application jar; anything after the jar is passed as an argument to the application's main method.)

To launch the spark-streaming job in remote-debug mode (suspend=y makes the JVM wait for the debugger):

spark-1.3.0-bin-hadoop2.4/bin/spark-submit --conf "spark.driver.extraJavaOptions=-agentlib:jdwp=transport=dt_socket,address=5000,server=y,suspend=y" --master yarn-client --num-executors 3 --executor-memory 4g --driver-memory 2g --jars $SPARK_HOME/lib/postgresql-9.4-1201.jdbc41.jar --class com.nsfocus.bsaata.merge.main.App /home/bsauser/bsaata_merge-1.0-jar-with-dependencies.jar

Also note that the export SPARK_JAVA_OPTS line in ${SPARK_HOME}/conf/spark-env.sh must be commented out, to avoid a conflict with spark.driver.extraJavaOptions.
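A small sketch of that edit, operating on a local stand-in file (the real file lives at ${SPARK_HOME}/conf/spark-env.sh, and the SPARK_JAVA_OPTS value below is a made-up example):

```shell
# Create a stand-in spark-env.sh containing an active SPARK_JAVA_OPTS export.
cat > spark-env.sh <<'EOF'
export SPARK_JAVA_OPTS="-Xmx2g"
EOF
# Comment the line out so it cannot clash with spark.driver.extraJavaOptions.
sed -i 's/^export SPARK_JAVA_OPTS/# &/' spark-env.sh
cat spark-env.sh
```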

As above, configure the host and port in IDEA's Run Configuration; you can then add breakpoints in IDEA and single-step through the job.


If IDEA ends the debug session with an exception once execution reaches the SparkContext object, you need the Spark sources to continue debugging: install the Scala plugin in IDEA and download the spark-streaming source jars into the local mvn repository, after which debugging can proceed.
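One way to pull those source jars, assuming the job's pom.xml lists spark-streaming as a dependency, is the maven-dependency-plugin's sources goal, which resolves source jars for the project's dependencies into the local repository (printed rather than run here, since it needs the project's pom.xml and network access):

```shell
# Resolve source jars for all project dependencies (including spark-streaming)
# into the local Maven repository; run from the directory containing pom.xml.
MVN_CMD="mvn dependency:sources"
echo "$MVN_CMD"
```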

