Check your configuration as follows:
$ shopt
...
huponexit off
...
$
As you can see, `huponexit` is `off`. So you don't have to use `nohup` in this case.
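If you ever want the opposite behavior, you can toggle the option with `shopt` (`-s` sets it, `-u` unsets it):
$ shopt -s huponexit
$ shopt -u huponexit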
Sunday, December 20, 2015
Friday, December 18, 2015
AdSense is not showing up on Blogspot
If you set up AdSense on Blogspot but the ads are not showing up,
wait a few minutes.
For me, they were blank at first, but after a few minutes the ads showed up.
Deploy a jar file to a Maven repository in Gradle
To deploy a jar file to a Maven repository in Gradle,
add the following configuration to `build.gradle`:
apply plugin: 'maven'
repositories {
mavenLocal()
jcenter()
maven { url "http://repo.izeye.com:8080/repository/internal" }
}
uploadArchives {
repositories {
mavenDeployer {
repository(url: "http://repo.izeye.com:8080/repository/internal") {
authentication(userName: "admin", password: "1234")
}
}
}
}
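With this configuration in place, you should be able to publish the jar by running the `uploadArchives` task:
gradle uploadArchives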
If you encounter the following error:
Could not find metadata :test/maven-metadata.xml in remote (http://repo.izeye.com:8080/repository/internal)
make sure you added the repository to the `repositories` block as well.
Reference:
https://docs.gradle.org/current/userguide/maven_plugin.html
Tuesday, December 15, 2015
rsync: failed to connect to 1.2.3.4: Connection refused (111)
If you encounter the following error:
$ rsync -avr 1.2.3.4::I/home/izeye/test/ .
rsync: failed to connect to 1.2.3.4: Connection refused (111)
rsync error: error in socket IO (code 10) at clientserver.c(124) [receiver=3.0.6]
$
check your rsync configuration as follows:
$ sudo vi /etc/xinetd.d/rsync
service rsync
{
disable = yes
flags = IPv6
socket_type = stream
wait = no
user = root
server = /usr/bin/rsync
server_args = --daemon
log_on_failure += USERID
}
and change `disable` from `yes` to `no`.
Restart xinetd as follows:
$ sudo /etc/init.d/xinetd restart
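You can then check that the daemon accepts connections by asking it to list its modules; it should print the configured modules (such as `I` above):
$ rsync 1.2.3.4::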
As a side note, if you encounter the following error:
$ rsync -avr 1.2.3.4::I/home/izeye/test/ .
@ERROR: access denied to I from unknown (5.6.7.8)
rsync error: error starting client-server protocol (code 5) at main.c(1503) [receiver=3.0.6]
$
check your `/etc/rsyncd.conf`, especially the `hosts allow` and `hosts deny` values.
rm: cannot unlink `/c/Users/izeye/IdeaProjects/impression-neo/.git/rebase-merge/.git-rebase-todo.swp': Permission denied
If you encounter the following error while aborting a rebase:
C:\Users\izeye\IdeaProjects\impression-neo>git rebase --abort
rm: cannot unlink `/c/Users/izeye/IdeaProjects/impression-neo/.git/rebase-merge/.git-rebase-todo.swp': Permission denied
rm: cannot remove directory `/c/Users/izeye/IdeaProjects/impression-neo/.git/rebase-merge': Directory not empty
C:\Users\izeye\IdeaProjects\impression-neo>
Kill `vim.exe`.
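For example, from the same command prompt (this kills every running `vim.exe`, so make sure no other Vim session matters to you):
C:\Users\izeye\IdeaProjects\impression-neo>taskkill /F /IM vim.exe
Then run `git rebase --abort` again.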
Force-push a specific branch to origin in Git
To force-push a specific branch to origin in Git,
do as follows:
git push -f origin gh-1004
Reference:
http://stackoverflow.com/questions/11453807/force-push-current-branch
Monday, November 30, 2015
File exists. error: failed to run pack-refs
If you encounter the following error:
Error pulling origin
fatal: Unable to create 'C:/Users/nbp/IdeaProjects/xxx/.git/packed-refs.lock': File exists. error: failed to run pack-refs
do as follows:
del .git/packed-refs.lock
although I don't know why this happened.
Monday, November 23, 2015
How to exclude `src/main/resources` from a jar file in Gradle
To exclude `src/main/resources` from a jar file in Gradle, add the following to `build.gradle`:
processResources {
exclude '**'
}
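You can confirm that the resources were excluded by listing the jar contents; for example (the jar name here is just an illustration):
jar tf build/libs/test-1.0.jar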
How to refresh a snapshot dependency in Gradle
To refresh a snapshot dependency in Gradle, add the following to `build.gradle`:
configurations.all {
resolutionStrategy.cacheChangingModulesFor 0, 'seconds'
}
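If the dependency comes from an Ivy repository (as in the referenced discussion), you may also need to mark the dependency itself as changing; a sketch with a made-up coordinate:
dependencies {
    compile('com.example:my-lib:1.0-SNAPSHOT') { changing = true }
}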
Reference:
https://discuss.gradle.org/t/how-to-get-gradle-to-download-newer-snapshots-to-gradle-cache-when-using-an-ivy-repository/7344
Tuesday, November 17, 2015
ElasticsearchIllegalArgumentException[failed to execute script]; nested: ScriptException[scripts of type [inline], operation [update] and lang [groovy] are disabled];
When you use inline scripting as follows:
$ curl -XPOST 'localhost:9200/customer/external/1/_update?pretty' -d '
{
"script" : "ctx._source.age += 5"
}'
you might get the following error:
{
"error" : "ElasticsearchIllegalArgumentException[failed to execute script]; nested: ScriptException[scripts of type [inline], operation [update] and lang [groovy] are disabled]; ",
"status" : 400
}
Because inline scripting has been disabled since 1.4.3, you have to enable it explicitly as follows:
config/elasticsearch.yml
script.inline: on
You can check the result as follows:
$ curl 'localhost:9200/customer/external/1?pretty'
References:
https://www.elastic.co/guide/en/elasticsearch/reference/1.7/_updating_documents.html
https://www.elastic.co/guide/en/elasticsearch/reference/1.7/modules-scripting.html
Wednesday, November 4, 2015
How to use a constructor when deserializing in Jackson
To use a constructor when deserializing in Jackson,
do as follows:
@Data
public class ClickUrl {
private final String url;
private String finalUrl;
@JsonCreator
public ClickUrl(@JsonProperty("url") String url) {
this.url = url;
}
}
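A quick usage sketch (using Jackson's `com.fasterxml.jackson.databind.ObjectMapper`; the URL is made up):
ObjectMapper mapper = new ObjectMapper();
ClickUrl clickUrl = mapper.readValue("{\"url\":\"http://izeye.com\"}", ClickUrl.class);
System.out.println(clickUrl.getUrl()); // prints http://izeye.com (getter generated by Lombok's @Data)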
Reference:
http://www.cowtowncoder.com/blog/archives/2011/07/entry_457.html
Friday, October 30, 2015
How to add images to output with AsciiDoctor Gradle plugin
To add images to the output with the AsciiDoctor Gradle plugin,
add the following configuration to your `build.gradle` file:
asciidoctor {
resources {
from (sourceDir) {
include "**/*.png"
}
}
}
Reference:
https://github.com/asciidoctor/asciidoctor-gradle-plugin
ERR_CONNECTION_ABORTED in Chrome when you deploy a WAR file to Tomcat via Manager App
You might get ERR_CONNECTION_ABORTED in Chrome when you deploy a WAR file to Tomcat via the Manager App.
If so, check your log file for the Manager App, `logs/manager.2015-10-30.log`.
It might look like this:
30-Oct-2015 15:49:08.586 SEVERE [http-nio-8080-exec-8] org.apache.catalina.core.ApplicationContext.log HTMLManager: FAIL - Deploy Upload Failed, Exception: org.apache.tomcat.util.http.fileupload.FileUploadBase$SizeLimitExceededException: the request was rejected because its size (59114627) exceeds the configured maximum (52428800)
java.lang.IllegalStateException: org.apache.tomcat.util.http.fileupload.FileUploadBase$SizeLimitExceededException: the request was rejected because its size (59114627) exceeds the configured maximum (52428800)
at org.apache.catalina.connector.Request.parseParts(Request.java:2804)
at org.apache.catalina.connector.Request.parseParameters(Request.java:3073)
at org.apache.catalina.connector.Request.getParameter(Request.java:1095)
at org.apache.catalina.connector.RequestFacade.getParameter(RequestFacade.java:380)
at org.apache.catalina.filters.CsrfPreventionFilter.doFilter(CsrfPreventionFilter.java:185)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:108)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:217)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:614)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:673)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1500)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.tomcat.util.http.fileupload.FileUploadBase$SizeLimitExceededException: the request was rejected because its size (59114627) exceeds the configured maximum (52428800)
at org.apache.tomcat.util.http.fileupload.FileUploadBase$FileItemIteratorImpl.<init>(FileUploadBase.java:811)
at org.apache.tomcat.util.http.fileupload.FileUploadBase.getItemIterator(FileUploadBase.java:256)
at org.apache.tomcat.util.http.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:280)
at org.apache.catalina.connector.Request.parseParts(Request.java:2734)
... 28 more
If you encounter the above error, increase the maximum upload size in `webapps/manager/WEB-INF/web.xml` appropriately:
<multipart-config>
<!-- 50MB max -->
<max-file-size>52428800</max-file-size>
<max-request-size>52428800</max-request-size>
<file-size-threshold>0</file-size-threshold>
</multipart-config>
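For example, to accept WAR files up to 100 MB, you could raise both limits (104857600 bytes = 100 MB; you may need to restart Tomcat for the change to take effect):
<multipart-config>
<!-- 100MB max -->
<max-file-size>104857600</max-file-size>
<max-request-size>104857600</max-request-size>
<file-size-threshold>0</file-size-threshold>
</multipart-config>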
Friday, October 23, 2015
How to dump a text file to byte values in hex
To dump a text file to byte values in hex,
use the following command:
xxd test.txt
How to list all exported system environment variables in Linux
To list all exported system environment variables in Linux,
use `printenv` or `export -p`.
Tuesday, October 6, 2015
Make sure a corresponding org.thymeleaf.doctype.resolution.IDocTypeResolutionEntry implementation is provided by you dialect
When you use Thymeleaf with Spring Boot and use the following DOCTYPE:
<!DOCTYPE html SYSTEM "http://www.thymeleaf.org/dtd/xhtml1-strict-thymeleaf-4.dtd">
you will get the following errors:
2015-10-07 10:19:45.957 ERROR 3301 --- [io-18080-exec-1] org.thymeleaf.TemplateEngine : [THYMELEAF][http-nio-18080-exec-1] Exception processing template "restaurants/add": Unsupported entity requested with PUBLICID "null" and SYSTEMID "http://www.thymeleaf.org/dtd/xhtml1-strict-thymeleaf-4.dtd". Make sure a corresponding org.thymeleaf.doctype.resolution.IDocTypeResolutionEntry implementation is provided by you dialect
2015-10-07 10:19:45.964 ERROR 3301 --- [io-18080-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is org.thymeleaf.exceptions.TemplateProcessingException: Unsupported entity requested with PUBLICID "null" and SYSTEMID "http://www.thymeleaf.org/dtd/xhtml1-strict-thymeleaf-4.dtd". Make sure a corresponding org.thymeleaf.doctype.resolution.IDocTypeResolutionEntry implementation is provided by you dialect] with root cause
org.thymeleaf.exceptions.TemplateProcessingException: Unsupported entity requested with PUBLICID "null" and SYSTEMID "http://www.thymeleaf.org/dtd/xhtml1-strict-thymeleaf-4.dtd". Make sure a corresponding org.thymeleaf.doctype.resolution.IDocTypeResolutionEntry implementation is provided by you dialect
at org.thymeleaf.templateparser.EntityResolver.resolveEntity(EntityResolver.java:75) ~[thymeleaf-2.1.4.RELEASE.jar:2.1.4.RELEASE]
...
Use the following DOCTYPE instead:
<!DOCTYPE html SYSTEM "http://www.thymeleaf.org/dtd/xhtml1-strict-thymeleaf-spring4-4.dtd">
Reference:
http://stackoverflow.com/questions/25132189/thymleaf-unsupported-entity-requested-with-publicid-null
Thursday, October 1, 2015
Kibana doesn't recognize changed `not_analyzed`
When you change a field's index type to `not_analyzed`
and Kibana doesn't pick up the change,
do as follows:
Settings -> Reload field list
and try again.
Make a field a `not_analyzed` index in Elasticsearch
To make a field a `not_analyzed` index in Elasticsearch,
use `@Field` as follows:
@Field(type = FieldType.String, index = FieldIndex.not_analyzed)
private String name;
and check as follows:
curl -XGET 'http://localhost:9200/event/_mapping?pretty'
and then you will see as follows:
"name" : {
"type" : "string",
"index" : "not_analyzed"
},
Note that the following annotation doesn't work:
@Field(index = FieldIndex.not_analyzed)
I'm not sure why type inference doesn't work.
Set a `java.util.Date` typed field as `date` type in Elasticsearch
To set a `java.util.Date` typed field as `date` type in Elasticsearch,
use `@Field` as follows:
@Field(type = FieldType.Date)
private Date timestamp;
and check as follows:
curl -XGET 'http://localhost:9200/event/_mapping?pretty'
and then you will see as follows:
"timestamp" : {
"type" : "date",
"format" : "dateOptionalTime"
}
Wednesday, September 23, 2015
Change encoding for main and test sources in Gradle
compileJava.options.encoding = 'UTF-8'
compileTestJava.options.encoding = 'UTF-8'
Reference:
https://github.com/huxi/sulky/blob/master/build.gradle
Tuesday, September 22, 2015
QueryDSL + JPA + Gradle in IntelliJ
`build.gradle`:
apply plugin: 'idea'
idea {
module {
sourceDirs += file('generated/')
}
}
dependencies {
...
compile("com.mysema.querydsl:querydsl-jpa:$querydslVersion")
compile("com.mysema.querydsl:querydsl-apt:$querydslVersion:jpa")
...
}
Set up IntelliJ:
File -> Settings...
Compiler -> Annotation Processors
`Enable annotation processing`
Store generated sources relative to: `Module content root`
Reference:
http://bsideup.blogspot.kr/2015/04/querydsl-with-gradle-and-idea.html
Monday, September 21, 2015
Print errors in Flask
Use the debug flag as follows:
app.run(debug=True)
Reference:
http://blog.luisrei.com/articles/flaskrest.html
Create an object in Python
You can create an object as follows:
class DiskSpace(object):
    def __init__(self, total, free):
        self.total = total
        self.free = free

    def __str__(self):
        return "total: %d, free: %d" % (self.total, self.free)
diskSpace = DiskSpace(100, 50)
print diskSpace
References:
http://stackoverflow.com/questions/15081542/python-creating-objects
http://stackoverflow.com/questions/727761/python-str-and-lists
http://www.pythonforbeginners.com/concatenation/string-concatenation-and-formatting-in-python
`Hello, world!` by Flask in Linux
Install pip if not installed:
sudo yum install python-pip
Install Flask if not installed:
sudo pip install Flask
Don't name your sample file `flask.py`.
If you do so, you will get the following error:
Traceback (most recent call last):
File "flask.py", line 1, in <module>
from flask import Flask
File "/home/izeye/workspaces/izeye/python/flask.py", line 1, in <module>
from flask import Flask
ImportError: cannot import name Flask
In `test_flask.py`:
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello, world!"

if __name__ == "__main__":
    app.run()
You can run it as follows:
python test_flask.py
References:
http://flask.pocoo.org/
http://stackoverflow.com/questions/14792605/python-flask-import-error
How to get disk usage by Python in Linux
To get disk usage,
you can use the following code:
import os
statvfs = os.statvfs('/')
total_disk_space = statvfs.f_frsize * statvfs.f_blocks
free_disk_space = statvfs.f_frsize * statvfs.f_bfree
disk_usage = (total_disk_space - free_disk_space) * 100.0 / total_disk_space
print total_disk_space
print free_disk_space
print disk_usage
Reference:
http://stackoverflow.com/questions/4260116/find-size-and-free-space-of-the-filesystem-containing-a-given-file
Saturday, September 19, 2015
How to download JDK with `wget`
To download JDK with `wget`,
use the following command:
wget --no-check-certificate --header="Cookie: oraclelicense=accept-securebackup-cookie" "http://download.oracle.com/otn-pub/java/jdk/8u60-b27/jdk-8u60-linux-x64.tar.gz"
Reference:
https://gist.github.com/hgomez/4697585
Monday, September 7, 2015
How to get 2 days ago in Linux
To get 2 days ago in Linux,
you can do as follows:
date --date="-2day" "+%Y.%m.%d"
Reference:
http://www.cyberciti.biz/tips/linux-unix-get-yesterdays-tomorrows-date.html
Delete documents in Elasticsearch
If you want to delete documents of a specific type,
you can delete the documents as follows:
$ curl -XDELETE http://localhost:9200/logstash-2015.09.05/logs
{"acknowledged":true}
$
If you want to delete documents of all types under a specific index,
you can delete the documents as follows:
$ curl -XDELETE http://localhost:9200/logstash-2015.09.05
{"acknowledged":true}
$
References:
http://stackoverflow.com/questions/23917327/delete-all-documents-from-index-type-without-deleting-type
https://www.elastic.co/blog/what-is-an-elasticsearch-index
How to get the number of documents and the total size in an index in Elasticsearch
To get the number of documents and the total size in an index,
use the following command:
curl -XGET 'http://localhost:9200/logstash-2015.09.05/_stats?pretty'
Replace `logstash-2015.09.05` with your index name.
Reference:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-stats.html
How to get all indices in Elasticsearch
To get all indices in Elasticsearch,
use the following command:
curl -XGET 'http://localhost:9200/*?pretty'
Reference:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-index.html
Saturday, September 5, 2015
How to handle nested backticks (`) in Bash
If you use nested backticks (`),
you will get the following errors:
$ echo `printf '%02d' $((10#`date +%M` / 30 * 30))`
-bash: command substitution: line 1: unexpected EOF while looking for matching `)'
-bash: command substitution: line 2: syntax error: unexpected end of file
-bash: command substitution: line 1: syntax error near unexpected token `)'
-bash: command substitution: line 1: ` / 30 * 30))'
date +%M
$
Use `$()` instead as follows:
$ echo `printf '%02d' $((10#$(date +%M) / 30 * 30))`
00
$
Reference:
http://stackoverflow.com/questions/2657012/how-to-properly-nest-bash-backticks
Friday, September 4, 2015
How to add a leading zero in Bash
If you want to add a leading zero to a number,
you can do as follows:
$ echo `printf '%02d' 8`
08
$
Reference:
http://stackoverflow.com/questions/55754/bash-script-to-pad-file-names
How to handle a leading zero in Bash
If you have a leading zero in a number,
the number is handled as an octal number, as follows:
$ echo $((08 + 1))
-bash: 08: value too great for base (error token is "08")
$
To handle it as a decimal number, do as follows:
$ echo $((10#08 + 1))
9
$
Reference:
http://blog.famzah.net/2010/08/07/beware-of-leading-zeros-in-bash-numeric-variables/
Thursday, September 3, 2015
How to get the total size of specific files in Linux
To get the total size of specific files in Linux,
you can do as follows:
du -ch debug.log.*
Reference:
http://unix.stackexchange.com/questions/72661/show-sum-of-file-sizes-in-directory-listing
Wednesday, September 2, 2015
How to import a Gradle project from a Git repository in STS (or Eclipse)
1. Clone the Git repository:
Window -> Show View -> Other...
Git -> Git Repositories
Clone a Git repository -> Clone URI
2. Install Gradle plugin:
Help -> Eclipse Marketplace... -> Gradle Integration for Eclipse 3.7.0.RELEASE
3. Import the project:
Import -> Gradle -> Gradle Project
Monday, August 24, 2015
How to override Spring Boot application.properties in tests
To override Spring Boot application.properties in tests,
use the following annotations:
@TestPropertySource(properties = {"spring.cache.type=simple"})
@DirtiesContext
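For example, on a Spring Boot 1.x test class (class names here are illustrative):
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@TestPropertySource(properties = {"spring.cache.type=simple"})
@DirtiesContext
public class CacheTests {
    // ...
}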
References:
http://stackoverflow.com/questions/29669393/override-default-spring-boot-application-properties-settings-in-junit-test
https://github.com/spring-projects/spring-boot/issues/2198
Sunday, August 23, 2015
org.springframework.mail.MailAuthenticationException: Authentication failed; nested exception is javax.mail.AuthenticationFailedException
When you try to send an email with Gmail SMTP,
you might get the following exception:
org.springframework.mail.MailAuthenticationException: Authentication failed; nested exception is javax.mail.AuthenticationFailedException: 534-5.7.14 <https://accounts.google.com/ContinueSignIn?sarp=1&scc=1&plt=AKgnsbuBm
534-5.7.14 7mH3ysGBLpLeQAGDrZrkNi3uUJsTPF6P8pszrRFOKqdKWGKsDpWBkcrvwJC02xOAQsW7b-
...
534-5.7.14 zFvhhWx1SQ-yRuGO0fa8JMMszw3E> Please log in via your web browser and
534-5.7.14 then try again.
534-5.7.14 Learn more at
534 5.7.14 https://support.google.com/mail/answer/78754 z16sm15149096pbt.3 - gsmtp
at org.springframework.mail.javamail.JavaMailSenderImpl.doSend(JavaMailSenderImpl.java:424)
at org.springframework.mail.javamail.JavaMailSenderImpl.send(JavaMailSenderImpl.java:307)
at org.springframework.mail.javamail.JavaMailSenderImpl.send(JavaMailSenderImpl.java:296)
Go to the following URL:
https://www.google.com/settings/security/lesssecureapps
Set `Access for less secure apps` to `Turn on`.
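As a side note, if you configure the sender through Spring Boot, a typical Gmail SMTP setup in `application.properties` looks something like this (username and password are placeholders):
spring.mail.host=smtp.gmail.com
spring.mail.port=587
spring.mail.username=your-account@gmail.com
spring.mail.password=your-password
spring.mail.properties.mail.smtp.auth=true
spring.mail.properties.mail.smtp.starttls.enable=true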
Reference:
http://stackoverflow.com/questions/20337040/gmail-smtp-debug-error-please-log-in-via-your-web-browser
java.lang.IllegalArgumentException: Unknown ordinal value [100] for enum class [com.izeye.test.event.domain.EventSource]
If you try to fetch an undefined enum ordinal value in JPA backed by Hibernate,
you will get the following exception:
java.lang.IllegalArgumentException: Unknown ordinal value [100] for enum class [com.izeye.test.event.domain.EventSource]
at org.hibernate.type.EnumType$OrdinalEnumValueMapper.fromOrdinal(EnumType.java:391)
at org.hibernate.type.EnumType$OrdinalEnumValueMapper.getValue(EnumType.java:381)
at org.hibernate.type.EnumType.nullSafeGet(EnumType.java:107)
at org.hibernate.type.CustomType.nullSafeGet(CustomType.java:127)
at org.hibernate.type.AbstractType.hydrate(AbstractType.java:106)
at org.hibernate.persister.entity.AbstractEntityPersister.hydrate(AbstractEntityPersister.java:2969)
at org.hibernate.loader.Loader.loadFromResultSet(Loader.java:1696)
If you want to avoid the above exception and just get `null` instead,
add the following code:
@Converter(autoApply = true)
public class EventSourceConverter implements AttributeConverter<EventSource, Integer> {
@Override
public Integer convertToDatabaseColumn(EventSource eventSource) {
return eventSource.ordinal();
}
@Override
public EventSource convertToEntityAttribute(Integer ordinal) {
EventSource[] values = EventSource.values();
if (ordinal >= values.length) {
return null;
}
return values[ordinal];
}
}
Reference:
http://www.javacodegeeks.com/2014/05/jpa-2-1-type-converter-the-better-way-to-persist-enums.html
Saturday, August 15, 2015
org.hibernate.AssertionFailure: possible non-threadsafe access to session
When you are using a sequence for an entity
and persist it, detach it, and merge it in a transaction,
you will get the following error:
ERROR: HHH000099: an assertion failure occured (this may indicate a bug in Hibernate, but is more likely due to unsafe use of the session): org.hibernate.AssertionFailure: possible non-threadsafe access to session
You should flush it before merging it.
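As a sketch with a plain `EntityManager` (the entity is illustrative):
User user = new User();
entityManager.persist(user);
entityManager.flush(); // flush before detaching; otherwise merging fails as above
entityManager.detach(user);
entityManager.merge(user);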
Unfortunately, I don't understand the reason.
I'm not sure whether it's a bug or incorrect usage.
It was just test code, and I'm not sure whether it could happen in a real scenario.
This inspection controls whether the Persistence QL Queries are error-checked
If you get the following error in IntelliJ:
Can't resolve symbol 'Person' less... (⌘F1)
This inspection controls whether the Persistence QL Queries are error-checked
do the following:
Inspection 'Query language checks' options -> Disable inspection
Reference:
http://devday.tistory.com/entry/Cant-resolve-symbol-Note
Friday, August 14, 2015
ERROR: Sequence "HIBERNATE_SEQUENCE" not found
If you have the following with H2:
@Id
@GeneratedValue
private Long id;
you might get the following error:
Hibernate:
call next value for hibernate_sequence
Aug 15, 2015 2:49:23 PM org.hibernate.engine.jdbc.spi.SqlExceptionHelper logExceptions
WARN: SQL Error: 90036, SQLState: 90036
Aug 15, 2015 2:49:23 PM org.hibernate.engine.jdbc.spi.SqlExceptionHelper logExceptions
ERROR: Sequence "HIBERNATE_SEQUENCE" not found; SQL statement:
call next value for hibernate_sequence [90036-187]
ex: javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: could not prepare statement
I guess `GenerationType.AUTO` uses `GenerationType.SEQUENCE`
and H2 doesn't support it.
I tried with `GenerationType.IDENTITY` as follows:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
and it works.
UPDATED:
Actually H2 DOES support `GenerationType.SEQUENCE`
and you should do as follows:
@Entity
@SequenceGenerator(
name = "USER_SEQ_GENERATOR",
sequenceName = "USER_SEQ",
initialValue = 1, allocationSize = 1)
public class User {
@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "USER_SEQ_GENERATOR")
private Long id;
...
}
Wednesday, August 12, 2015
How to change a version of dependency managed by Spring Boot in Gradle
To change the following in `spring-boot-dependencies`:
<spring-data-releasetrain.version>Gosling-RC1</spring-data-releasetrain.version>
add the following in `build.gradle`:
ext['spring-data-releasetrain.version'] = 'Gosling-M1'
Reference:
http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/#howto-customize-dependency-versions
Thursday, July 30, 2015
Exclude specific tasks of a sub-project in Gradle
To exclude specific tasks of a sub-project in Gradle,
do as follows:
gradle build -x :test-common:startScripts -x :test-common:bootRepackage
Run a specific task of a sub-project in Gradle
To run a specific task of a sub-project in Gradle,
do as follows:
gradle :test-common:startScripts
Wednesday, July 29, 2015
No signature of method: build_59tj2ra5d1mn9tk0vrfinvo79c$_run_closure2.id() is applicable for argument types: (java.lang.String) values: [org.asciidoctor.convert]
You might encounter the following exception:
FAILURE: Build failed with an exception.
* Where:
Build file '/home/izeye/applications/test/build.gradle' line: 9
* What went wrong:
A problem occurred evaluating root project 'test'.
> No signature of method: build_59tj2ra5d1mn9tk0vrfinvo79c$_run_closure2.id() is applicable for argument types: (java.lang.String) values: [org.asciidoctor.convert]
Possible solutions: is(java.lang.Object), is(java.lang.Object), find(), find(), find(groovy.lang.Closure), find(groovy.lang.Closure)
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
Gradle version was 1.12.
Just upgrading to 2.5 fixed the problem.
Tuesday, July 28, 2015
Error: Invalid value for <path> attribute d="MNaN,82.5LNaN,41.25LNaN,0LNaN,123.75LNaN,330"
You might encounter the following error in D3.js:
Error: Invalid value for <path> attribute d="MNaN,82.5LNaN,41.25LNaN,0LNaN,123.75LNaN,330"
If you get the following date and time format:
"collectedTime" : "2015-05-19 14:48:40",
you need to parse it as follows:
svg.x.domain(d3.extent(data, function (d) {
// return d.collectedTime;
var parsedTime = d3.time.format("%Y-%m-%d %H:%M:%S").parse(d.collectedTime);
return parsedTime;
}));
svg.line = d3.svg.line()
.x(function (d) {
//return svg.x(d.timestamp);
//return svg.x(d.collectedTime);
var parsedTime = d3.time.format("%Y-%m-%d %H:%M:%S").parse(d.collectedTime);
return svg.x(parsedTime);
})
.y(function (d) {
return svg.y(valueFunction(d));
});
References:
http://stackoverflow.com/questions/26070783/invalid-value-for-path-attribute
https://github.com/mbostock/d3/wiki/Time-Formatting
Monday, July 27, 2015
Get the latest commit ID in Git
To get the latest commit ID in Git,
use the following command:
$ git log --format="%H" -n 1
Reference:
http://stackoverflow.com/questions/19176359/how-to-get-the-last-commit-id-of-a-remote-repo-using-curl-like-command
Git rebase and squash commits
* Check commit logs as follows:
$ git log
commit da274bb605a962fa02ae8fba2da27b934b59d68d
Author: izeye <izeye@naver.com>
Date: Tue Jul 28 11:55:38 2015 +0900
Fix typos.
commit 0e5f855954a69dd05128442bf9c996efc8d35f0d
Author: izeye <izeye@naver.com>
Date: Tue Jul 28 11:47:52 2015 +0900
Fix typos.
commit 32128a6ac2390250d9e0b933618177846fb7bef0
Author: Stephane Nicoll <snicoll@pivotal.io>
Date: Mon Jul 27 16:02:03 2015 +0200
Polish
...
* `git rebase` with the base commit as follows:
$ git rebase -i 32128a6ac2390250d9e0b933618177846fb7bef0
pick 0e5f855 Fix typos.
squash da274bb Fix typos.
* Type `:wq` and change commit messages as follows:
# This is a combination of 2 commits.
# The first commit's message is:
Fix typos.
# This is the 2nd commit message:
#Fix typos.
* Type `:wq` again.
* Check commit logs again as follows:
$ git log
commit d51d6b7e8d21c60a6ac9e6e6e60847c112514998
Author: izeye <izeye@naver.com>
Date: Tue Jul 28 11:47:52 2015 +0900
Fix typos.
commit 32128a6ac2390250d9e0b933618177846fb7bef0
Author: Stephane Nicoll <snicoll@pivotal.io>
Date: Mon Jul 27 16:02:03 2015 +0200
Polish
If you want to push it to the remote repository,
do as follows:
$ git push -f
Reference:
https://github.com/edx/edx-platform/wiki/How-to-Rebase-a-Pull-Request
Sunday, July 26, 2015
only buildscript {} and other plugins {} script blocks are allowed before plugins {} blocks, no other statements are allowed
You might encounter the following error:
only buildscript {} and other plugins {} script blocks are allowed before plugins {} blocks, no other statements are allowed
Just follow the instruction in the error message.
In other words, move the `plugins {}` script block to the top of the script,
keeping only `buildscript {}` and other `plugins {}` script blocks above it.
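A sketch of the allowed ordering (the plugin id comes from the error message; the version is just an example):
buildscript {
    // optional; may stay above plugins {}
}
plugins {
    id 'org.asciidoctor.convert' version '1.5.2'
}
// everything else (repositories, dependencies, ...) goes below the plugins {} block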
Change Gradle version in IntelliJ
To change Gradle version in IntelliJ,
do as follows:
File -> Settings -> Gradle -> Gradle home
Format `yyyy-MM-dd HH:mm:ss` in Bash
To format `yyyy-MM-dd HH:mm:ss` in Bash,
do as follows:
date +"%F %T"
Saturday, July 25, 2015
Uncaught RangeError: Maximum call stack size exceeded
I got the following error suddenly in Windows Chrome Canary:
Uncaught RangeError: Maximum call stack size exceeded
It didn't happen in Windows Chrome, Mac Chrome, or Mac Chrome Canary,
so I guess it's a bug in Windows Chrome Canary.
I'm wondering how this kind of bug happens.
I assume they have tons of tests, and it's quite easily reproducible, so why did they miss it?
Friday, July 24, 2015
java.lang.NoClassDefFoundError: org/codehaus/groovy/runtime/typehandling/ShortTypeHandling
When I used the following command:
gradle build
I got the following exception:
FAILURE: Build failed with an exception.
* Where:
Build file 'C:\Users\nbp\IdeaProjects\impression-neo\build.gradle' line: 101
* What went wrong:
Execution failed for task ':generateGitProperties'.
> java.lang.NoClassDefFoundError: org/codehaus/groovy/runtime/typehandling/ShortTypeHandling
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output.
BUILD FAILED
When I checked the Gradle version with the following command:
gradle -v
the result was as follows:
------------------------------------------------------------
Gradle 1.11
------------------------------------------------------------
Build time: 2014-02-11 11:34:39 UTC
Build number: none
Revision: a831fa866d46cbee94e61a09af15f9dd95987421
Groovy: 1.8.6
Ant: Apache Ant(TM) version 1.9.2 compiled on July 8 2013
Ivy: 2.2.0
JVM: 1.8.0_05 (Oracle Corporation 25.5-b02)
OS: Windows 7 6.1 amd64
When I upgraded Gradle to 2.5 as follows:
------------------------------------------------------------
Gradle 2.5
------------------------------------------------------------
Build time: 2015-07-08 07:38:37 UTC
Build number: none
Revision: 093765bccd3ee722ed5310583e5ed140688a8c2b
Groovy: 2.3.10
Ant: Apache Ant(TM) version 1.9.3 compiled on December 23 2013
JVM: 1.8.0_05 (Oracle Corporation 25.5-b02)
OS: Windows 7 6.1 amd64
the problem was solved.
I guess the problem was caused by the old version of Groovy.
Thursday, July 23, 2015
Use Swagger in Spring Boot
To use Swagger in Spring Boot,
add the following dependencies to `build.gradle`:
compile("io.springfox:springfox-swagger2:2.1.1")
compile("io.springfox:springfox-swagger-ui:2.1.1")
and add `@EnableSwagger2` as follows:
@SpringBootApplication
@EnableSwagger2
public class Application {
public static void main(String[] args) {
SpringApplication.run(Application.class, args);
}
}
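If you want to limit which controllers and paths show up in the documentation, you can optionally add a `Docket` bean as well. This is a minimal sketch using the standard springfox builder API; the selectors shown here simply include everything and are only an example:
// A sketch: these springfox selectors include every handler and path.
@Bean
public Docket api() {
    return new Docket(DocumentationType.SWAGGER_2)
            .select()
            .apis(RequestHandlerSelectors.any())
            .paths(PathSelectors.any())
            .build();
}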
Open the following URL in your browser:
http://localhost:8080/swagger-ui.html
Reference:
http://springfox.github.io/springfox/docs/current/
Monday, July 20, 2015
Spring Data JPA `IN` clause with Spring Data REST
You can do as follows:
http://localhost:8080/api/events/search/findByLevelIn?level=ERROR,WARN
See the following link for the details on Spring Data JPA `IN` clause:
http://izeye.blogspot.kr/2015/07/use-in-caluse-with-spring-data-jpa.html
Use `IN` clause with Spring Data JPA
To use `IN` clause with Spring Data JPA,
you can do as follows:
public interface EventRepository extends JpaRepository<Event, Long> {
Page<Event> findByLevelIn(@Param("level") Set<EventLevel> levels, Pageable pageable);
}
You can test as follows:
@Test
public void testFindByLevelIn() {
Set<EventLevel> levels = new HashSet<>();
levels.add(EventLevel.ERROR);
levels.add(EventLevel.WARN);
levels.add(EventLevel.INFO);
Page<Event> events = eventRepository.findByLevelIn(levels, new PageRequest(0, 100));
events.forEach(System.out::println);
}
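If you prefer an explicit query to the derived method name, the same `IN` clause can be written with `@Query`. This is just a sketch under the assumption that the `Event` entity has a `level` field; it is equivalent to the derived `findByLevelIn` above:
public interface EventRepository extends JpaRepository<Event, Long> {

    // JPQL "in" with a collection parameter.
    @Query("select e from Event e where e.level in :levels")
    Page<Event> findByLevels(@Param("levels") Set<EventLevel> levels, Pageable pageable);
}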
Reference:
http://stackoverflow.com/questions/18987292/spring-crudrepository-findbyinventoryidslistlong-inventoryidlist-equivalen
sun.net.util.IPAddressUtil vs. org.apache.commons.validator.routines.InetAddressValidator
There are two IP address validation utils:
sun.net.util.IPAddressUtil
org.apache.commons.validator.routines.InetAddressValidator
I made some tests for both.
`IPAddressUtil` tests:
// NOTE:
// Got the following warning in Travis:
// warning: IPAddressUtil is internal proprietary API and may be removed in a future release
public class IPAddressUtilTests {
@Rule
public ExpectedException thrown = ExpectedException.none();
@Test
public void testIsIPv4LiteralAddress() {
assertTrue(IPAddressUtil.isIPv4LiteralAddress("1.2.3.0"));
assertTrue(IPAddressUtil.isIPv4LiteralAddress("1.2.3.4"));
assertTrue(IPAddressUtil.isIPv4LiteralAddress("1.2.3.255"));
assertFalse(IPAddressUtil.isIPv4LiteralAddress("1.2.3.256"));
assertFalse(IPAddressUtil.isIPv4LiteralAddress("1.2.3."));
// NOTE: They look weird but valid.
// See http://stackoverflow.com/questions/7550806/check-valid-ipv4-address-in-java
assertTrue(IPAddressUtil.isIPv4LiteralAddress("1.2.3"));
assertTrue(IPAddressUtil.isIPv4LiteralAddress("1.2"));
assertTrue(IPAddressUtil.isIPv4LiteralAddress("1"));
assertFalse(IPAddressUtil.isIPv4LiteralAddress(""));
assertFalse(IPAddressUtil.isIPv4LiteralAddress("::1"));
thrown.expect(NullPointerException.class);
IPAddressUtil.isIPv4LiteralAddress(null);
}
@Test
public void testIsIPv6LiteralAddress() {
assertTrue(IPAddressUtil.isIPv6LiteralAddress("::1"));
assertFalse(IPAddressUtil.isIPv6LiteralAddress("1.2.3.4"));
}
}
`InetAddressValidator` tests:
public class InetAddressValidatorTests {
InetAddressValidator validator = InetAddressValidator.getInstance();
@Test
public void testIsValidInet4Address() {
assertTrue(validator.isValidInet4Address("1.2.3.0"));
assertTrue(validator.isValidInet4Address("1.2.3.4"));
assertTrue(validator.isValidInet4Address("1.2.3.255"));
assertFalse(validator.isValidInet4Address("1.2.3.256"));
assertFalse(validator.isValidInet4Address("1.2.3."));
assertFalse(validator.isValidInet4Address("1.2.3"));
assertFalse(validator.isValidInet4Address("1.2"));
assertFalse(validator.isValidInet4Address("1"));
assertFalse(validator.isValidInet4Address(""));
assertFalse(validator.isValidInet4Address("::1"));
assertFalse(validator.isValidInet4Address(null));
}
@Test
public void testIsValidInet6Address() {
assertTrue(validator.isValidInet6Address("::1"));
assertFalse(validator.isValidInet6Address("1.2.3.4"));
}
}
As you can see from the tests above,
they behave differently.
Don't use `IPAddressUtil`.
I got the following warning from Travis builds when I used it:
warning: IPAddressUtil is internal proprietary API and may be removed in a future release
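If you just need a small helper around the Commons Validator API in application code, a minimal sketch might look like this (the class and method names are my own, not part of the library):
import org.apache.commons.validator.routines.InetAddressValidator;

public final class IpUtils {

    private static final InetAddressValidator VALIDATOR = InetAddressValidator.getInstance();

    private IpUtils() {
    }

    // Returns true only for strict dotted-quad IPv4 addresses such as "1.2.3.4".
    public static boolean isIpv4(String candidate) {
        return candidate != null && VALIDATOR.isValidInet4Address(candidate);
    }
}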
Saturday, July 18, 2015
Reference AngularJS's `$scope` in Chrome's Console
To reference AngularJS's `$scope.events` in Chrome's Console,
select the target element in the `Elements` tab and use the following in the `Console` tab:
angular.element($0).scope().events
Reference:
http://stackoverflow.com/questions/13743058/how-to-access-the-angular-scope-variable-in-browsers-console
Friday, July 17, 2015
Run a specific test in Gradle
To run a specific test in Gradle,
you can use the following command:
gradle test --tests *SomeTests.test
Reference:
http://stackoverflow.com/questions/22505533/how-to-run-a-one-test-class-only-on-gradle
Spring Data REST max page size 1000
When I tried to fetch 2000 entries in Spring Data REST,
I got only 1000 entries.
I didn't know the reason until I saw the following source:
https://github.com/spring-projects/spring-data-rest/blob/master/spring-data-rest-core/src/main/java/org/springframework/data/rest/core/config/RepositoryRestConfiguration.java
It is used in `HateoasPageableHandlerMethodArgumentResolver.enhance()` as follows:
builder.replaceQueryParam(sizePropertyName, pageable.getPageSize() <= getMaxPageSize() ? pageable.getPageSize()
: getMaxPageSize());
If you use Spring Boot and want to change the default value,
you can add the following property to `application.properties`:
spring.data.rest.max-page-size=10000
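If you'd rather set this in Java configuration than in `application.properties`, the underlying setter can be called from a `RepositoryRestConfigurerAdapter`. This is a sketch based on the `RepositoryRestConfiguration` source linked above, not the only way to do it:
@Configuration
public class RestConfig extends RepositoryRestConfigurerAdapter {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
        // Raise the cap from the default of 1000.
        config.setMaxPageSize(10000);
    }
}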
I couldn't find any related documentation other than the source.
In my case, a warning when the requested size exceeds the max page size would be nice,
but I guess it could be irritating in some cases.
Thursday, July 16, 2015
Apply `WebJarsResourceResolver` in Spring Boot
To apply `WebJarsResourceResolver` in Spring Boot,
add the following property to `application.properties`:
spring.resources.chain.enabled:true
and add the following dependencies to `build.gradle`:
compile("org.webjars:webjars-locator:0.26")
compile("org.webjars:jquery:2.1.4")
Now you can use jQuery as follows in Thymeleaf templates:
<script src="/webjars/jquery/jquery.min.js"></script>
How to exclude a file when publishing an artifact to Maven repository in Gradle
To exclude a `logback.xml` file when publishing an artifact to Maven repository in Gradle,
you can use the following configuration:
ext {
artifactId = project.name
artifactVersion = project.version
}
jar {
baseName = artifactId
version = artifactVersion
}
task commonJar(type: Jar) {
baseName "test-common"
from sourceSets.main.output
exclude 'logback.xml'
}
publishing {
repositories {
maven {
credentials {
username "admin"
password "1234"
}
url "http://repo.test.com:8080/repository/internal"
}
}
publications {
maven(MavenPublication) {
groupId 'com.test'
artifactId artifactId
version artifactVersion
artifact commonJar
}
}
}
Reference:
https://docs.gradle.org/current/userguide/publishing_maven.html
Tuesday, July 14, 2015
Check from a Bash script whether a specific Spring profile is activated in an environment variable
When an environment variable holding Spring profiles looks like this:
SPRING_PROFILE=test,something-else
you can check whether the `test` Spring profile is activated with the following Bash script:
spring_active_profiles=(${SPRING_PROFILE//,/ })
spring_profile_test_activated=false
for spring_active_profile in ${spring_active_profiles[@]}
do
if [ $spring_active_profile == "test" ]
then
spring_profile_test_activated=true
fi
done
echo $spring_profile_test_activated
if [ "$spring_profile_test_activated" = true ]
then
echo "Spring profile 'test' is activated."
fi
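As a side note, if you're already inside the Spring application rather than in a shell script, checking the active profiles in Java is simpler. A minimal sketch assuming an injected `org.springframework.core.env.Environment`:
@Autowired
private Environment environment;

public boolean isTestProfileActive() {
    // acceptsProfiles() matches against the currently active profiles.
    return this.environment.acceptsProfiles("test");
}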
Saturday, July 4, 2015
Configure Graphite with Dropwizard Metrics in Spring Boot
To configure Graphite with Dropwizard Metrics in Spring Boot,
add the following Java Config:
@Configuration
public class GraphiteConfig {
@Autowired
private MetricRegistry metricRegistry;
@Value("${graphite.host}")
private String graphiteHost;
@Value("${graphite.port}")
private int graphitePort;
@PostConstruct
public void initialize() {
Graphite graphite = new Graphite(this.graphiteHost, this.graphitePort);
GraphiteReporter reporter = GraphiteReporter.forRegistry(this.metricRegistry)
.prefixedWith(NetworkUtils.HOSTNAME.replace(".", "_"))
.convertRatesTo(TimeUnit.SECONDS)
.convertDurationsTo(TimeUnit.MILLISECONDS)
.filter(MetricFilter.ALL).build(graphite);
reporter.start(1, TimeUnit.MINUTES);
}
}
and add the following properties:
graphite.host=1.2.3.4
graphite.port=2003
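Once the reporter is running, anything registered in the `MetricRegistry` is reported to Graphite every minute. A minimal usage sketch (the metric names are arbitrary examples):
@Autowired
private MetricRegistry metricRegistry;

public void handleRequest() {
    // Count how many times this method is called.
    this.metricRegistry.counter("sample.requests").inc();

    // Measure how long the work takes.
    Timer.Context context = this.metricRegistry.timer("sample.requests.latency").time();
    try {
        // ... do the actual work ...
    } finally {
        context.stop();
    }
}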
Reference:
https://dropwizard.github.io/metrics/3.1.0/manual/graphite/
Install Graphite with Python 2.6.6 on CentOS 6.6
If you don't have pip, install it as follows:
sudo yum install python-pip
# Install Django
sudo pip install 'https://www.djangoproject.com/download/1.4.20/tarball/'
# Install django-tagging
sudo easy_install django-tagging==0.3.1
# Install Twisted
sudo pip install twisted
# Install pytz
sudo pip install pytz
# Install bitmap-fonts-compat
sudo yum install bitmap-fonts-compat
# Install Apache
sudo yum install httpd
# Install mod_wsgi
sudo yum install mod_wsgi
sudo vi /etc/httpd/conf.d/wsgi.conf
#LoadModule wsgi_module modules/mod_wsgi.so
# Install Graphite
sudo pip install https://github.com/graphite-project/ceres/tarball/master
sudo pip install whisper
sudo pip install carbon
sudo pip install graphite-web
# Configure Carbon
cd /opt/graphite/conf
sudo cp carbon.conf.example carbon.conf
sudo cp storage-schemas.conf.example storage-schemas.conf
# Run Carbon
sudo /opt/graphite/bin/carbon-cache.py start
tail -F /opt/graphite/storage/log/carbon-cache/carbon-cache-a/console.log
# Test Carbon
echo "local.random.diceroll 4 `date +%s`" | nc localhost 2003
tail -F /opt/graphite/storage/log/carbon-cache/carbon-cache-a/creates.log
05/07/2015 14:22:22 :: new metric local.random.diceroll matched schema default_1min_for_1day
05/07/2015 14:22:22 :: new metric local.random.diceroll matched aggregation schema default
05/07/2015 14:22:22 :: creating database file /opt/graphite/storage/whisper/local/random/diceroll.wsp (archive=[(60, 1440)] xff=None agg=None)
# Configure Graphite Webapp
sudo cp /opt/graphite/examples/example-graphite-vhost.conf /etc/httpd/conf.d/graphite.conf
sudo cp /opt/graphite/conf/graphite.wsgi.example /opt/graphite/conf/graphite.wsgi
sudo cp /opt/graphite/webapp/graphite/local_settings.py.example /opt/graphite/webapp/graphite/local_settings.py
sudo chown apache:apache /opt/graphite/storage/
sudo chown apache:apache /opt/graphite/storage/log/webapp/
sudo python /opt/graphite/webapp/graphite/manage.py syncdb
# Start Apache
sudo /sbin/service httpd start
tail -F /opt/graphite/storage/log/webapp/error.log
# Visit Graphite Webapp Using Web Browser
http://your-ip-address/
References:
http://graphite.readthedocs.org/en/latest/install.html
http://graphite.readthedocs.org/en/latest/install-pip.html
http://graphite.readthedocs.org/en/latest/config-carbon.html
http://graphite.readthedocs.org/en/latest/admin-carbon.html
https://gist.github.com/ashrithr/9224450
Friday, July 3, 2015
Install a specific version of a package with easy_install
To install a specific version of a package with easy_install,
you can use the following command:
sudo easy_install django-tagging==0.3.1
Reference:
https://gist.github.com/ashrithr/9224450
Show installed packages by pip
If you want to know whether the `tagging` package is installed,
you can use the following command:
$ pip freeze | grep tagging
django-tagging==0.4
tagging==0.2.1
$
Reference:
https://pip.pypa.io/en/latest/reference/pip_freeze.html
Show the version of an installed package by pip
If you want to know the version of a package (say, `django-tagging`) installed with pip,
use the following command:
$ pip show django-tagging
---
Name: django-tagging
Version: 0.4
Location: /usr/lib/python2.6/site-packages
Requires:
$
Reference:
http://stackoverflow.com/questions/10214827/find-which-version-of-package-is-installed-with-pip
Show installed packages by yum
If you want to know whether the `python-devel` package is installed,
you can use the following command:
$ yum list installed | grep python-devel
python-devel.x86_64 2.6.6-52.el6 @update
$
Reference:
http://www.electrictoolbox.com/yum-list-installed-packages/
Install Django with Python 2.6.6
If you try the following command:
sudo pip install django
you might get an error.
With Python 2.6.6, you can only use Django 1.4.
So install it with the following command instead:
sudo pip install 'https://www.djangoproject.com/download/1.4.20/tarball/'
You can check if it's working as follows:
$ python
...
>>> import django
>>> print(django.get_version())
1.4.20
>>>
If you want to use a version newer than 1.4,
you should upgrade Python to at least 2.7.
References:
https://docs.djangoproject.com/en/1.8/intro/install/
https://docs.djangoproject.com/en/1.8/faq/install/
https://www.djangoproject.com/download/
[warn] module wsgi_module is already loaded, skipping
If you copied the example vhost configuration for Graphite as follows:
sudo cp /opt/graphite/examples/example-graphite-vhost.conf /etc/httpd/conf.d/graphite.conf
you might get the following warning:
httpd () : [Fri Jul 03 19:16:29 2015] [warn] module wsgi_module is already loaded, skipping
You should comment out the original WSGI configuration as follows:
sudo vi /etc/httpd/conf.d/wsgi.conf
#LoadModule wsgi_module modules/mod_wsgi.so
Reference:
http://stackoverflow.com/questions/12120057/module-wsgi-module-is-already-loaded-skipping
Apache error log location in CentOS
In CentOS, you can find Apache error log in the following location:
/var/log/httpd/error_log
Reference:
http://lists.centos.org/pipermail/centos/2008-January/049950.html
Install pip in CentOS
To install pip in CentOS, use the following command:
sudo yum install python-pip
Reference:
https://pip.pypa.io/en/latest/installing.html
Wednesday, July 1, 2015
Read `Map` by `ObjectMapper.readValue()`
If you try to do the following:
Map<String, String> metricsMap = objectMapper.readValue(response, Map.class);
you might encounter the following exception:
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
You can solve the problem by using `TypeReference` as follows:
private final TypeReference<Map<String, String>> metricsType
= new TypeReference<Map<String, String>>() {};
Map<String, String> metricsMap = objectMapper.readValue(response, metricsType);
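Alternatively, Jackson's `TypeFactory` can build the same `Map<String, String>` type without an anonymous subclass; a minimal sketch:
// constructMapType() builds the parameterized map type at runtime.
MapType mapType = objectMapper.getTypeFactory()
        .constructMapType(HashMap.class, String.class, String.class);
Map<String, String> metricsMap = objectMapper.readValue(response, mapType);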
Tuesday, June 30, 2015
`spring.data.rest.base-uri` doesn't work
As of Spring Boot 1.3.0.M1, the following property no longer works:
spring.data.rest.base-uri=/api
You have to use the following property:
spring.data.rest.base-path=/api
This corresponds to a change in Spring Data REST.
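If you prefer Java configuration over the property, the base path can also be set through a `RepositoryRestConfigurerAdapter`; this is a sketch, not the only way to do it:
@Configuration
public class RestBasePathConfig extends RepositoryRestConfigurerAdapter {

    @Override
    public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
        config.setBasePath("/api");
    }
}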
Monday, June 29, 2015
Handle keys having dots in JsonPath
When you handle keys having dots in JsonPath,
you can use one of the following:
// NOTE: 1L doesn't work. Why?
assertThat(JsonPath.<Long> read(metrics, "['counter.test']"), is(1));
assertThat(JsonPath.<Long> read(metrics, "$['counter.test']"), is(1));
assertThat(JsonPath.<Long> read(metrics, "$.['counter.test']"), is(1));
The `<Long>` type parameter doesn't force a conversion because generics are erased at runtime:
the underlying JSON provider parses the whole number as an `Integer`, so the matcher sees an `Integer`, not a `Long`.
You can see a sample in the following:
https://github.com/izeye/samples-spring-boot-branches/blob/rest-and-actuator/src/test/java/samples/springboot/counter/web/CounterControllerTests.java
Reference:
http://stackoverflow.com/questions/19726859/jsonpath-junit-escape-character-for-dots
Get detailed stack traces of failed tests in Gradle
When a test failed in Gradle,
you will get the following stack trace:
com.izeye.test.SomeTests > test FAILED
java.lang.IllegalStateException
Caused by: org.springframework.beans.factory.BeanCreationException
Caused by: java.lang.NoClassDefFoundError
Caused by: java.lang.ClassNotFoundException
But that's often not very useful.
When you want a detailed stack trace,
add the following configuration:
test {
testLogging {
exceptionFormat = 'full'
}
}
Now you will get the following output:
com.izeye.test.SomeTests > test FAILED
java.lang.IllegalStateException: Failed to load ApplicationContext
Caused by:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'sqlSessionFactory' defined in class path resource [com/izeye/test/config/PersistenceConfig.class]: Invocation of init method failed; nested exception is java.lang.NoClassDefFoundError: org/w3c/dom/ElementTraversal
Caused by:
java.lang.NoClassDefFoundError: org/w3c/dom/ElementTraversal
Caused by:
java.lang.ClassNotFoundException: org.w3c.dom.ElementTraversal
Now you can see that the culprit was `org.w3c.dom.ElementTraversal`.
Reference:
http://java.dzone.com/articles/gradle-goodness-show-more
Caused by: java.lang.NoSuchFieldException: $jacocoAccess
When you run the following command:
gradle clean test
you might get the following exception:
Caused by: java.lang.NoSuchFieldException: $jacocoAccess
JaCoCo default version (0.6.2.201302030002) doesn't work with Java 8.
You can check your JaCoCo version with the following configuration:
jacoco {
println toolVersion
}
and you will get the following output:
0.6.2.201302030002
You can fix it with the following configuration:
jacoco {
toolVersion = "0.7.1.201405082137"
}
Reference:
https://github.com/jacoco/jacoco/issues/74
Missing `datasource.xxx.active` and `datasource.xxx.usage` in `metrics` endpoint in Spring Boot Actuator
In Spring Boot Actuator's `metrics` endpoint,
if you can't see `datasource.xxx.active` and `datasource.xxx.usage`,
you might have the following configuration:
@SpringBootApplication(exclude = {
DataSourceAutoConfiguration.class,
DataSourceTransactionManagerAutoConfiguration.class
})
Showing these metrics requires `DataSourcePoolMetadataProvidersConfiguration`,
which is included in `DataSourceAutoConfiguration`.
So if you don't use `DataSourceAutoConfiguration`, you have to import it explicitly:
@Import(DataSourcePoolMetadataProvidersConfiguration.class)
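Putting both pieces together, the application class might look like this (a minimal sketch; only the two annotations matter here):
@SpringBootApplication(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class
})
@Import(DataSourcePoolMetadataProvidersConfiguration.class)
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}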
It would be nice to have them separated, because `DataSourceAutoConfiguration` doesn't work
when you have multiple `DataSource`s.
Friday, June 26, 2015
`java.lang.StackOverflowError` in Spring Data MongoDB
You might encounter the following error suddenly:
java.lang.StackOverflowError
at java.lang.String.<init>(String.java:201)
at java.lang.String.substring(String.java:1956)
at sun.reflect.misc.ReflectUtil.isNonPublicProxyClass(ReflectUtil.java:288)
at sun.reflect.misc.ReflectUtil.checkPackageAccess(ReflectUtil.java:165)
at sun.reflect.misc.ReflectUtil.isPackageAccessible(ReflectUtil.java:195)
at java.beans.MethodRef.get(MethodRef.java:72)
at java.beans.PropertyDescriptor.getReadMethod(PropertyDescriptor.java:206)
at org.springframework.data.mapping.model.AbstractPersistentProperty.getGetter(AbstractPersistentProperty.java:159)
at org.springframework.data.mapping.model.BeanWrapper.getProperty(BeanWrapper.java:127)
at org.springframework.data.mapping.model.BeanWrapper.getProperty(BeanWrapper.java:100)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:419)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:412)
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:307)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:412)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeInternal(MappingMongoConverter.java:386)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writeCollectionInternal(MappingMongoConverter.java:622)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.createCollection(MappingMongoConverter.java:546)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.writePropertyInternal(MappingMongoConverter.java:457)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:424)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$3.doWithPersistentProperty(MappingMongoConverter.java:412)
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:307)
I didn't dig into the problem due to lack of time
but the culprit was `Logger` in a MongoDB document as follows:
private final Logger log = LoggerFactory.getLogger(getClass());
It could be fixed with `org.springframework.data.annotation.Transient` as follows:
@Transient
private final Logger log = LoggerFactory.getLogger(getClass());
but it doesn't make sense for every document to carry its own field for the logger,
even though the logger instance itself is effectively a singleton.
Using `static` is much better:
private static final Logger log = LoggerFactory.getLogger(NaboxImpressionLog.class);
Reference:
http://stackoverflow.com/questions/21734785/mongodb-spring-saving-an-object-causes-stackoverflowerror
Sunday, June 21, 2015
Error:Cause: net.rubygrapefruit.platform.internal.jni.WindowsHandleFunctions.markStandardHandlesUninheritable(Lnet/rubygrapefruit/platform/internal/FunctionResult;)V
I encountered the following error in IntelliJ when I tried to run a Gradle task:
Error:Cause: net.rubygrapefruit.platform.internal.jni.WindowsHandleFunctions.markStandardHandlesUninheritable(Lnet/rubygrapefruit/platform/internal/FunctionResult;)V
I could still run Gradle tasks in a terminal, but I couldn't run them in IntelliJ any more.
I'll update this post if I find a solution.
Any hint to solve this problem would be appreciated.
---
UPDATED:
Restarting IntelliJ solved the problem :-)
Wednesday, June 17, 2015
java.lang.IllegalArgumentException: Cannot locate declared field class org.apache.http.impl.client.HttpClientBuilder.sslcontext
When you try to use `HtmlUnit`,
you might encounter the following exception:
java.lang.IllegalArgumentException: Cannot locate declared field class org.apache.http.impl.client.HttpClientBuilder.sslcontext
This happens because you're using HttpComponents 4.5.
If you don't specifically need that version,
downgrade it to 4.4.1 as follows:
testCompile("org.apache.httpcomponents:httpclient:4.4.1")
testCompile("org.apache.httpcomponents:httpmime:4.4.1")
Reference:
http://htmlunit.10904.n7.nabble.com/HtmlUnit-htmlunit-bugs-1692-Update-to-HttpComponents-4-5-td36318.html
Tuesday, June 16, 2015
Show all indexes in MongoDB
To show all indexes in MongoDB,
you can use the following command:
db.someCollection.getIndexes()
Reference:
http://docs.mongodb.org/manual/tutorial/list-indexes/
Relocate MongoDB storage location in CentOS
To relocate MongoDB storage location in CentOS,
stop MongoDB server with the following command:
sudo /sbin/service mongod stop
Modify MongoDB configuration as follows:
sudo vi /etc/mongod.conf
#dbpath=/var/lib/mongo
dbpath=/home/izeye/mongo
Start MongoDB server with the following command:
sudo /sbin/service mongod start
Note that the directory should be owned by the `mongod` user.
If not, change the ownership with the following command:
sudo chown mongod /home/izeye/mongo/
Caused by: com.mongodb.WriteConcernException: { "serverUsed" : "1.2.3.4:27017" , "ok" : 1 , "n" : 0 , "err" : "new file allocation failure" , "code" : 12520}
You might encounter the following exception:
Caused by: com.mongodb.WriteConcernException: { "serverUsed" : "1.2.3.4:27017" , "ok" : 1 , "n" : 0 , "err" : "new file allocation failure" , "code" : 12520}
You can check your MongoDB server log with the following command:
sudo vi /var/log/mongodb/mongod.log
The log will have the following lines:
2015-06-10T20:19:19.311+0900 I - [conn7] Assertion: 12520:new file allocation failure
2015-06-10T20:19:19.313+0900 I STORAGE [FileAllocator] allocating new datafile /var/lib/mongo/test.6, filling with zeroes...
2015-06-10T20:19:19.316+0900 I - [conn7] Assertion: 12520:new file allocation failure
2015-06-10T20:19:19.316+0900 I STORAGE [FileAllocator] FileAllocator: posix_fallocate failed: errno:28 No space left on device falling back
2015-06-10T20:19:19.317+0900 I STORAGE [FileAllocator] error: failed to allocate new file: /var/lib/mongo/test.6 size: 2146435072 failure creating new datafile; lseek failed for fd 22 with errno: errno:2 No such file or directory. will try again in 10 seconds
2015-06-10T20:19:19.322+0900 I - [conn7] Assertion: 12520:new file allocation failure
You can see there isn't enough disk space.
So either clean up your disk
or relocate the MongoDB storage location
to a directory mounted on a different partition or disk.
Sunday, June 14, 2015
Caused by: java.lang.NoClassDefFoundError: org/w3c/dom/ElementTraversal
You can encounter the following error after upgrading to Spring Boot 1.3.0.M1:
Caused by: java.lang.NoClassDefFoundError: org/w3c/dom/ElementTraversal
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.xerces.parsers.AbstractDOMParser.startDocument(Unknown Source)
at org.apache.xerces.impl.dtd.XMLDTDValidator.startDocument(Unknown Source)
at org.apache.xerces.impl.XMLDocumentScannerImpl.startEntity(Unknown Source)
at org.apache.xerces.impl.XMLVersionDetector.startDocumentParsing(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
at org.apache.ibatis.parsing.XPathParser.createDocument(XPathParser.java:254)
at org.apache.ibatis.parsing.XPathParser.<init>(XPathParser.java:125)
at org.apache.ibatis.builder.xml.XMLConfigBuilder.<init>(XMLConfigBuilder.java:75)
at org.mybatis.spring.SqlSessionFactoryBean.buildSqlSessionFactory(SqlSessionFactoryBean.java:358)
at org.mybatis.spring.SqlSessionFactoryBean.afterPropertiesSet(SqlSessionFactoryBean.java:340)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1637)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1574)
With the hint from StackOverflow and the following command:
gradle dependencies
I could figure out that `xml-apis-1.4.01` had been changed to `xml-apis-1.3.04` as follows:
Before upgrading to Spring Boot 1.3.0.M1:
+--- net.sourceforge.nekohtml:nekohtml:1.9.22
| \--- xerces:xercesImpl:2.11.0
| \--- xml-apis:xml-apis:1.4.01
After upgrading to Spring Boot 1.3.0.M1:
+--- net.sourceforge.nekohtml:nekohtml:1.9.22
| \--- xerces:xercesImpl:2.11.0
| \--- xml-apis:xml-apis:1.4.01 -> 1.3.04
With the following command:
gradle dependencyInsight --dependency xml-apis
you will get the following result:
:dependencyInsight
xml-apis:xml-apis:1.3.04 (selected by rule)
xml-apis:xml-apis:1.4.01 -> 1.3.04
\--- xerces:xercesImpl:2.11.0
\--- net.sourceforge.nekohtml:nekohtml:1.9.22
\--- compile
`spring-boot-dependencies` enforces `1.3.04` as follows:
<xml-apis.version>1.3.04</xml-apis.version>
You can resolve the problem by using `xml-apis:1.4.01` as follows:
compile("xml-apis:xml-apis:1.4.01")
I created a PR to upgrade `xml-apis` as follows:
https://github.com/spring-projects/spring-boot/pull/3226
References:
http://stackoverflow.com/questions/10234201/appengine-error-java-lang-noclassdeffounderror-org-w3c-dom-elementtraversal
https://github.com/spring-projects/spring-boot/blob/master/spring-boot-dependencies/pom.xml
Friday, June 12, 2015
Error creating bean with name 'mbeanExporter' defined in class path resource [org/springframework/boot/autoconfigure/jmx/JmxAutoConfiguration.class]
When you have the following JMX configuration with Spring Boot,
@Bean
public MBeanServerFactoryBean mBeanServer() {
MBeanServerFactoryBean mBeanServerFactoryBean = new MBeanServerFactoryBean();
mBeanServerFactoryBean.setLocateExistingServerIfPossible(true);
return mBeanServerFactoryBean;
}
you can encounter the following exception:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'mbeanExporter' defined in class path resource [org/springframework/boot/autoconfigure/jmx/JmxAutoConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.jmx.export.annotation.AnnotationMBeanExporter]: Factory method 'mbeanExporter' threw exception; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'mbeanServer' is defined
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:599)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1119)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1014)
If you need to keep your own `MBeanServer`,
set the following property in `application.properties`:
spring.jmx.server=mBeanServer
If you don't need your own `MBeanServer`,
drop your JMX configuration for it
and use the one provided by Spring Boot.
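If you go that route, you can simply inject the `MBeanServer` that Spring Boot's `JmxAutoConfiguration` exposes instead of creating your own; a minimal sketch:
@Autowired
private MBeanServer mBeanServer;

public void registerSomething(Object mbean, ObjectName objectName) throws JMException {
    // Uses the auto-configured MBeanServer instead of a custom MBeanServerFactoryBean.
    this.mBeanServer.registerMBean(mbean, objectName);
}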
Reference:
http://docs.spring.io/spring-boot/docs/current-SNAPSHOT/reference/htmlsingle/
Thursday, June 11, 2015
List all indexes in MongoDB
To list all indexes in MongoDB,
you can do the following:
> db.someCollection.getIndexes()
[
{
"v" : 1,
"key" : {
"_id" : 1
},
"name" : "_id_",
"ns" : "test.someCollection"
},
{
"v" : 1,
"key" : {
"someIndexProperty" : 1
},
"name" : "someIndexProperty_1",
"ns" : "test.someCollection"
}
]
>
Reference:
http://docs.mongodb.org/manual/tutorial/list-indexes/
Create an index in MongoDB
To create an index in MongoDB,
you can run the following in the mongo shell:
> db.someCollection.createIndex({someIndexProperty: 1})
{
"createdCollectionAutomatically" : false,
"numIndexesBefore" : 1,
"numIndexesAfter" : 2,
"ok" : 1
}
>
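If you are using Spring Data MongoDB, you can create the same index programmatically. The following is a minimal sketch, assuming an existing `MongoTemplate` bean; the class name is only for illustration:
import org.springframework.data.domain.Sort;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.index.Index;

public class IndexCreator {

    private final MongoTemplate mongoTemplate;

    public IndexCreator(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    // Creates an ascending index on `someIndexProperty`,
    // equivalent to `db.someCollection.createIndex({someIndexProperty: 1})`.
    public void createIndex(String collectionName) {
        this.mongoTemplate.indexOps(collectionName)
                .ensureIndex(new Index().on("someIndexProperty", Sort.Direction.ASC));
    }
}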
Reference:
http://docs.mongodb.org/manual/tutorial/create-an-index/
Tuesday, June 9, 2015
Oracle VARCHAR2's length is BYTE or CHAR?
Is Oracle VARCHAR2's length measured in bytes or characters?
It depends on the option you used when creating the column:
VARCHAR2(20 BYTE)
VARCHAR2(20 CHAR)
What if you omit the option as follows?
VARCHAR2(20)
It depends on the global setting for length semantics.
You can check the setting as follows:
SELECT VALUE FROM NLS_DATABASE_PARAMETERS WHERE PARAMETER = 'NLS_LENGTH_SEMANTICS';
BYTE
References:
http://docs.oracle.com/cd/B28359_01/server.111/b28318/datatype.htm#CNCPT1824
http://oracleschools.blogspot.kr/2013/01/how-to-check-nlslengthsemantics.html
How to use `sort` in Spring Data REST
To use `sort` in Spring Data REST,
you can use a URL like the following:
http://localhost:{port}/api/customers?sort=firstName,desc
You can refer to the following sample project:
https://github.com/izeye/samples-spring-boot-branches/tree/jpa-and-data-rest
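Note that the `sort` parameter only takes effect when the exported repository supports paging and sorting. The following is a minimal sketch under that assumption; the `Customer` entity and `CustomerRepository` names are illustrative and simply chosen to match the URL above:
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.springframework.data.repository.PagingAndSortingRepository;

@Entity
public class Customer {

    @Id
    @GeneratedValue
    private Long id;

    private String firstName;

    // Getters and setters omitted for brevity.
}

// Spring Data REST exposes this repository under the configured base path
// (for example /api/customers), and PagingAndSortingRepository enables the
// ?sort=... request parameter.
interface CustomerRepository extends PagingAndSortingRepository<Customer, Long> {
}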
Reference:
http://docs.spring.io/spring-data/rest/docs/current/reference/html/
Monday, June 8, 2015
MongoDB installation location in Windows
The MongoDB installation location on Windows is as follows:
C:\Program Files\MongoDB\Server\3.0\bin
Allow all interfaces to listen for MongoDB in CentOS 6.6
To allow MongoDB to listen on all interfaces in CentOS 6.6,
comment out `bind_ip` as follows:
sudo vi /etc/mongod.conf
# Listen to local interface only. Comment out to listen on all interfaces.
#bind_ip=127.0.0.1
Restart the MongoDB server (mongod) as follows:
sudo /sbin/service mongod restart
Change MongoDB server port in CentOS 6.6
To change the MongoDB server port in CentOS 6.6,
edit the following configuration:
sudo vi /etc/mongod.conf
port=10000
Restart the MongoDB server (mongod) as follows:
sudo /sbin/service mongod restart
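Clients then need to be pointed at the new port explicitly. The following is a minimal sketch using the MongoDB Java driver; the host, port, and database name are placeholders matching the configuration above:
import com.mongodb.MongoClient;
import com.mongodb.client.MongoDatabase;

public class CustomPortConnectionExample {

    public static void main(String[] args) {
        // Connect to the non-default port configured in /etc/mongod.conf.
        MongoClient mongoClient = new MongoClient("1.2.3.4", 10000);
        MongoDatabase database = mongoClient.getDatabase("test");
        // Print the database name as a simple connectivity check.
        System.out.println(database.getName());
        mongoClient.close();
    }
}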
Reference:
http://docs.mongodb.org/manual/reference/configuration-options/
Install MongoDB on CentOS 6.6
To install MongoDB on CentOS 6.6,
add the MongoDB repository as follows:
sudo vi /etc/yum.repos.d/mongodb-org-3.0.repo
[mongodb-org-3.0]
name=MongoDB Repository
baseurl=http://repo.mongodb.org/yum/redhat/$releasever/mongodb-org/3.0/x86_64/
gpgcheck=0
enabled=1
Install MongoDB with `yum` as follows:
sudo yum install -y mongodb-org
Start MongoDB with `service` as follows:
sudo /sbin/service mongod start
Check the listening port in the log as follows:
sudo vi /var/log/mongodb/mongod.log
2015-06-08T21:14:23.908+0900 I NETWORK [initandlisten] waiting for connections on port 27017
Run the MongoDB client as follows:
mongo
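If you plan to access the server from Java, a quick connectivity check could look like the following. This is a minimal sketch using the MongoDB Java driver, assuming the server is running locally on the default port 27017:
import com.mongodb.MongoClient;

public class MongoSmokeTest {

    public static void main(String[] args) {
        // Connect to the freshly installed server on the default port.
        MongoClient mongoClient = new MongoClient("localhost", 27017);
        // Print the existing database names as a simple connectivity check.
        for (String name : mongoClient.listDatabaseNames()) {
            System.out.println(name);
        }
        mongoClient.close();
    }
}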
Reference:
http://docs.mongodb.org/manual/tutorial/install-mongodb-on-red-hat/
java.lang.IllegalArgumentException: Cannot subclass final class class com.sun.proxy.$Proxy52
When you are using Spring Data MongoDB,
you can encounter the following exception:
Caused by: java.lang.IllegalArgumentException: Cannot subclass final class class com.sun.proxy.$Proxy52
at org.springframework.cglib.proxy.Enhancer.generateClass(Enhancer.java:446)
at org.springframework.cglib.transform.TransformingClassGenerator.generateClass(TransformingClassGenerator.java:33)
It was caused by accidentally adding `@Repository` as follows:
@Repository
public interface SomethingRepository extends MongoRepository<Something, String> {
}
Removing `@Repository` fixes the problem.
It would be better if the exception had a more helpful message.
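For reference, the working version is shown below; this is a minimal sketch using the same hypothetical `Something` domain class as above:
import org.springframework.data.mongodb.repository.MongoRepository;

// No @Repository annotation: Spring Data detects this interface
// and registers the repository bean automatically.
public interface SomethingRepository extends MongoRepository<Something, String> {
}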
Reference:
https://github.com/spring-projects/spring-boot/issues/1929
Hotfix KB2731284 or later update is not installed, will zero-out data files
When you run a MongoDB server (mongod) on Windows,
you can encounter the following error:
C:\Program Files\MongoDB\Server\3.0\bin>mongod
2015-06-08T20:29:47.485+0900 I CONTROL Hotfix KB2731284 or later update is not installed, will zero-out data files
2015-06-08T20:29:47.565+0900 I STORAGE [initandlisten] exception in initAndListen: 29 Data directory C:\data\db\ not found., terminating
2015-06-08T20:29:47.565+0900 I CONTROL [initandlisten] dbexit: rc: 100
You can fix it by creating the following directory:
C:\data\db
The reason is that the hotfix line is only a warning; the fatal error is the missing data directory.
`\data\db` is mongod's default `dbpath`, and mongod does not create it automatically, so it exits when the directory does not exist.
Reference:
http://stackoverflow.com/questions/29316482/mongo-error-on-i-control-hotfix
Saturday, June 6, 2015
Show all sub-projects' dependencies in Gradle
In Gradle, to show all sub-projects' dependencies,
add the following to `build.gradle`:
subprojects {
task allDependencies(type: DependencyReportTask) {}
}
Now you can use the following command:
gradle allDependencies
Reference:
https://solidsoft.wordpress.com/2014/11/13/gradle-tricks-display-dependencies-for-all-subprojects-in-multi-project-build/
Caused by: java.lang.NoSuchMethodError: org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry.hasMappingForPattern(Ljava/lang/String;)Z
You can encounter the following error:
Caused by: java.lang.NoSuchMethodError: org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry.hasMappingForPattern(Ljava/lang/String;)Z
at org.springframework.boot.autoconfigure.web.WebMvcAutoConfiguration$WebMvcAutoConfigurationAdapter.addResourceHandlers(WebMvcAutoConfiguration.java:256)
at org.springframework.web.servlet.config.annotation.WebMvcConfigurerComposite.addResourceHandlers(WebMvcConfigurerComposite.java:102)
Using `gradle dependencies` didn't help me much.
It showed the dependency I expected:
org.springframework:spring-webmvc:4.1.6.RELEASE
The failing module is one of several Gradle sub-projects,
and the error is caused by another sub-project it depends on.
That sub-project has a dependency which pulls in the following transitive dependency:
org.springframework.mobile:spring-mobile-device:1.1.3.RELEASE
which in turn depends on the following:
org.springframework:spring-webmvc:3.2.11.RELEASE
So the conflict occurred.
I solved it as follows:
compile("xxx:xxx:1.0.0") {
exclude module: 'spring-mobile-device'
}
But I still don't understand why the newer version doesn't win.