Java SDK

Introduction

This SDK supports Java 6 and above. Building your web application with it lets you manage applications on the 网多云 platform with great convenience.

Open Source

安装

There are two ways to install and use the Java SDK:

  • with Maven
  • by downloading the jar files manually

Maven

Import the Shenjian SDK for Java into your project via Maven by declaring the following dependency:

<dependency>
    <groupId>io.shenjian.sdk</groupId>
    <artifactId>java-sdk</artifactId>
    <version>1.0.4</version>
</dependency>

Manually downloading the jars

Whenever possible, let a package manager resolve dependencies automatically. If that is not an option, download the jars by hand. The SDK depends on the following third-party libraries:

Library        Project page   Download
httpclient     link           link
commons-io     link           link
fastjson       link           link
slf4j-api      link           link
slf4j-simple   link           link

Click each library's download link, choose the matching jar, and add it to your project.

Authentication

Every feature of the 网多云 Java SDK requires valid authorization. Signing an authorization credential requires a valid User Key / User Secret pair under your 网多云 account. This key pair can be obtained as follows:
Click here to view your User Key and User Secret.

Fetching user information

Fetching the account balance

Fetch the balance of the user's account:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    double balance = client.getBalance();
    System.out.println("balance: " + balance);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Fetching node information

Fetch the usage of all the user's nodes:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    NodeInfo nodeInfo = client.getNodeInfo();
    System.out.println("total nodes: " + nodeInfo.getTotalNodes());
    System.out.println("running nodes: " + nodeInfo.getRunningNodes());
} catch (ShenjianException e) {
    e.printStackTrace();
}

Fetching application information

List all applications under the user's account:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    int page = 1;
    int pageSize = 50;
    List<App> list = client.listApp(page, pageSize);
    for (App app : list) {
        System.out.println("App ID: " + app.getAppId());
        System.out.println("Name: " + app.getName());
        System.out.println("Info: " + app.getInfo());
        System.out.println("Type: " + app.getType());
        System.out.println("Status: " + app.getStatus());
        System.out.println("Create Time: " + app.getCreateTime());
    }
} catch (ShenjianException e) {
    e.printStackTrace();
}
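listApp returns one page of results at a time; to enumerate every app, a common pattern is to keep requesting pages until a page comes back with fewer than pageSize entries. The sketch below illustrates that loop against a stubbed fetchPage, a hypothetical stand-in for client.listApp since the real API cannot be called here:

```java
import java.util.ArrayList;
import java.util.List;

/* Paging-loop sketch: fetchPage is a hypothetical stand-in for
 * client.listApp(page, pageSize); it serves 7 fake app names. */
public class PagingDemo {
    static final int TOTAL = 7;

    static List<String> fetchPage(int page, int pageSize) {
        List<String> result = new ArrayList<String>();
        int start = (page - 1) * pageSize;
        for (int i = start; i < Math.min(start + pageSize, TOTAL); i++) {
            result.add("app-" + i);
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> all = new ArrayList<String>();
        int page = 1;
        int pageSize = 3;
        while (true) {
            List<String> batch = fetchPage(page, pageSize);
            all.addAll(batch);
            /* a short (or empty) page means there is nothing left */
            if (batch.size() < pageSize) {
                break;
            }
            page++;
        }
        System.out.println("fetched " + all.size() + " apps");
    }
}
```

The same loop works for listCrawler below, which takes the same page/pageSize pair.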

Crawler Control

Listing crawlers

List the crawlers under the user's account:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    int page = 1;
    int pageSize = 50;
    List<Crawler> list = client.listCrawler(page, pageSize);
    for (Crawler crawler : list) {
        System.out.println("App ID: " + crawler.getAppId());
        System.out.println("Name: " + crawler.getName());
        System.out.println("Info: " + crawler.getInfo());
        System.out.println("Type: " + crawler.getType());
        System.out.println("Status: " + crawler.getStatus());
        System.out.println("Create Time: " + crawler.getCreateTime());
    }
} catch (ShenjianException e) {
    e.printStackTrace();
}

Creating a crawler

Create a new crawler app under the user's account. The crawler code must be JS code that runs correctly on the 网多云 platform.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    String name = "***crawler name***";
    String info = "***crawler description***";
    String code = "***crawler code***";
    Crawler crawler = client.createCrawler(name, info, code);
    System.out.println("Crawler ID : " + crawler.getAppId());
    System.out.println("Crawler name : " + crawler.getName());
    System.out.println("Crawler status : " + crawler.getStatus());
    System.out.println("Create time : " + crawler.getCreateTime());
} catch (ShenjianException e) {
    e.printStackTrace();
}

Deleting a crawler

Delete the specified crawler. Note: once a crawler is deleted, its crawl results cannot be recovered, so call this with care.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.deleteCrawler(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Editing crawler information

Edit the specified crawler's information, i.e. its name and description.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    String name = "the new name";
    String info = "the new description";
    client.editCrawler(crawlerId, name, info);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Configuring crawler custom settings

Modify a crawler's custom settings; the configurable items differ from crawler to crawler.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    /* Example: a map of your custom settings */
    Map<String, Object> configMap = new HashMap<String, Object>();
    configMap.put("crawlerStore", true);
    configMap.put("pageNum", 10);
    configMap.put("productUrl", "https://item.jd.com/3724805.php");
    configMap.put("keywords", new String[]{"男装", "女装"});
    client.configCrawlerCustom(crawlerId, configMap);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Starting a crawler

Start the specified crawler. This call takes quite a few parameters; click here for the detailed parameter reference. Only the code for setting the parameters and starting the crawler is shown below.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    int node = 1;
    CrawlerTimer crawlerTimer = new CrawlerTimer();
    /* Different launch and timer modes require the corresponding plan level;
       see the developer docs and plan descriptions for details. */
    /* Set the launch mode */
    crawlerTimer.setCrawlerMode(true, true, DupType.CHANGE, ChangeType.INSERT);
    /* Set the timer mode */
    /* Run once at a scheduled time */
    crawlerTimer.setTypeOnce("2018-1-30", "18:20", "19:20");
    /* Run daily */
    crawlerTimer.setTypeDaily("2018-1-30", "2018-2-28", "18:20", "19:20");
    /* Run weekly */
    crawlerTimer.setTypeWeekly("2018-1-30", "2018-2-28", new int[]{1, 2, 3, 4, 5, 6, 7}, "18:20", "19:20");
    /* Run periodically (real-time mode) */
    crawlerTimer.setTypeCyclically("2018-1-30", "2018-2-28", Duration.TEN_MIN, Interval.ONE_HOUR);
    /* Scheduled start with a given number of nodes */
    client.startCrawler(crawlerId, node, crawlerTimer);
    /* Scheduled start with the default single node */
    client.startCrawler(crawlerId, crawlerTimer);
    /* Immediate start with a given number of nodes */
    client.startCrawler(crawlerId, node);
    /* Immediate start with the default single node */
    client.startCrawler(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}
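The timer setters take dates as strings like "2018-1-30" and times like "18:20". A lenient parser will silently roll an impossible date such as February 30th into March, so it can be worth validating the strings before handing them to the SDK. A minimal sketch, assuming the "yyyy-M-d" format used in the examples above:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

/* Strict date validation; the "yyyy-M-d" pattern matches the date
 * strings in the startCrawler example and is an assumption here. */
public class TimerDateCheck {
    static boolean isValidDate(String s) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-M-d");
        fmt.setLenient(false); // reject impossible dates instead of rolling them over
        try {
            fmt.parse(s);
            return true;
        } catch (ParseException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidDate("2018-1-30")); // true
        System.out.println(isValidDate("2018-2-30")); // false: February has no 30th
    }
}
```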

Stopping a crawler

Stop the specified crawler:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.stopCrawler(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Pausing a crawler

Pause the specified crawler:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.pauseCrawler(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Resuming a crawler

Resume the specified paused crawler:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.resumeCrawler(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Fetching crawler status

Fetch the status of the specified crawler:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    AppStatus crawlerStatus = client.getCrawlerStatus(crawlerId);
    System.out.println("Crawler status : " + crawlerStatus.name());
} catch (ShenjianException e) {
    e.printStackTrace();
}

Fetching crawler speed

Fetch the running speed of the specified crawler, in kB/s:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    float crawlerSpeed = client.getCrawlerSpeed(crawlerId);
    System.out.println("Crawler speed = " + crawlerSpeed);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Changing crawler nodes

Change the number of nodes used by the specified crawler. The delta is the number of nodes to add or remove: greater than 0 adds nodes, less than 0 removes them, and 0 is not allowed.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    int nodeDelta = 1;
    CrawlerNodeInfo crawlerNodeInfo = client.changeCrawlerNode(crawlerId, nodeDelta);
    System.out.println("Running nodes = " + crawlerNodeInfo.getRunningNodes());
    System.out.println("Left nodes = " + crawlerNodeInfo.getLeftNodes());
} catch (ShenjianException e) {
    e.printStackTrace();
}
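Since a delta of 0 is rejected, it can help to validate locally before calling changeCrawlerNode. A tiny sketch; validateDelta is a hypothetical helper, not part of the SDK:

```java
/* Guard against the one value changeCrawlerNode does not accept. */
public class NodeDeltaCheck {
    static int validateDelta(int delta) {
        if (delta == 0) {
            throw new IllegalArgumentException("node delta must be non-zero");
        }
        return delta;
    }

    public static void main(String[] args) {
        System.out.println(validateDelta(2));  // add two nodes
        System.out.println(validateDelta(-1)); // release one node
    }
}
```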

Fetching data information

Fetch the information of the data source associated with a crawler.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    CrawlerSource crawlerSource = client.getCrawlerSource(crawlerId);
    System.out.println("Crawler ID : " + crawlerSource.getAppId());
    System.out.println("Type : " + crawlerSource.getType());
    System.out.println("Count = " + crawlerSource.getCount());
} catch (ShenjianException e) {
    e.printStackTrace();
}

Clearing crawler data

Clear a crawler's crawl results. Note: cleared results cannot be recovered, so call this with care.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.clearCrawlerData(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Deleting crawler data

Delete data the crawler collected more than N days ago. Note: this call returns immediately and the deletion runs in the background. The operation cannot be cancelled, and deleted results cannot be recovered, so call this with care.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    /* Delete data older than this many days; no default value, minimum 1 */
    int days = 1;
    client.deleteCrawlerData(crawlerId, days);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Configuring crawler file hosting

Modify the specified crawler's file-hosting settings. The hosting types are: none (0), Aliyun OSS (1), Qiniu cloud storage (2), and Upyun (4); the file types that can be hosted are image, text, audio, and application.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    HostType hostType = HostType.SHENJIANSHOU;
    /* Pick the file types to host as needed */
    int fileTypeFlag = FileType.IMAGE | FileType.TEXT;
    client.configCrawlerHost(crawlerId, hostType, fileTypeFlag);
} catch (ShenjianException e) {
    e.printStackTrace();
}
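The fileTypeFlag argument is a bit mask: each FileType constant occupies its own bit, so types combine with | and membership is tested with &. The sketch below illustrates the mechanics with assumed power-of-two values; the SDK's actual constants may differ:

```java
/* Bit-flag mechanics; the IMAGE/TEXT/AUDIO/APPLICATION values here
 * are assumptions for illustration, not the SDK's real constants. */
public class FileTypeFlags {
    static final int IMAGE = 1;
    static final int TEXT = 2;
    static final int AUDIO = 4;
    static final int APPLICATION = 8;

    static boolean contains(int flags, int type) {
        return (flags & type) != 0;
    }

    public static void main(String[] args) {
        int flags = IMAGE | TEXT; // host images and text only
        System.out.println(contains(flags, IMAGE)); // true
        System.out.println(contains(flags, AUDIO)); // false
    }
}
```

The webhook eventFlag below combines WebhookEventType constants the same way.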

Fetching the webhook settings

Fetch the specified crawler's webhook settings:

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    Webhook webhookInfo = client.getWebhookInfo(crawlerId);
    System.out.println("Webhook URL : " + webhookInfo.getUrl());
    System.out.println("Events : ");
    String[] events = webhookInfo.getEvents();
    for (int i = 0; i < events.length; i++) {
        System.out.println((i + 1) + ". " + events[i]);
    }
} catch (ShenjianException e) {
    e.printStackTrace();
}

Deleting the webhook

Delete a crawler's webhook settings.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.deleteCrawlerWebhook(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Configuring the webhook

Modify a crawler's webhook settings.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    String url = "<the webhook URL to set>";
    /* Combine the webhook events you need */
    int eventFlag = WebhookEventType.DATA_NEW | WebhookEventType.DATA_UPDATED | WebhookEventType.MSG_CUSTOM;
    client.configWebhookInfo(crawlerId, url, eventFlag);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Fetching the auto-publish status

Fetch the auto-publish status of the crawler's data source.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    AutoPublishStatus autoPublishStatus = client.getAutoPublishStatus(crawlerId);
    System.out.println("AutoPublish Status : " + autoPublishStatus.getStatus());
    System.out.println("AutoPublish Message : " + autoPublishStatus.getMessage());
    System.out.println("AutoPublish StopTime : " + autoPublishStatus.getStopTime());
} catch (ShenjianException e) {
    e.printStackTrace();
}

Enabling auto-publish

Enable auto-publish for the crawler's data source. (Publish targets can currently only be created in the web console; creating them through the API is not yet supported.)

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    /* IDs of the publish targets to enable */
    int[] publishId = new int[]{555555, 666666};
    client.startAutoPublish(crawlerId, publishId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

Disabling auto-publish

Stop auto-publish for the crawler's data source.

String userKey = "<your user_key>";
String userSecret = "<your user_secret>";
ShenjianClient client = new ShenjianClient(userKey, userSecret);

try {
    /* ID of the crawler to operate on */
    int crawlerId = 867247;
    client.stopAutoPublish(crawlerId);
} catch (ShenjianException e) {
    e.printStackTrace();
}

License

Copyright (c) 2020 网多软件科技

Released under the Apache License: