Tigase XMPP Server Board

API and development: RE: Connection time optimizations

Fri, 01/29/2016 - 02:15

It's "enabled" by default, i.e. it was never mandatory and is simply advertised as optional in the stream features.

Categories: Tigase Forums

API and development: RE: Connection time optimizations

Fri, 01/29/2016 - 01:51

Interesting, Artur. Is it enabled by default, or can it be enabled in the property file?


Installation and maintenance: RE: Too many close_wait connections

Fri, 01/29/2016 - 01:39

Did you have a chance to check the provided details? Please help us with this.


API and development: RE: Connection time optimizations

Thu, 01/28/2016 - 13:39

As a matter of fact, you can. Tigase has always allowed skipping this step, even back when RFC 3921 was current.


API and development: RE: Connection time optimizations

Thu, 01/28/2016 - 13:14

Thanks Artur,

I found one thing which we can actually omit:

If your server is new enough (i.e. implements RFC 6121 correctly), you can save a roundtrip by skipping the urn:ietf:params:xml:ns:xmpp-session IQ. See https://datatracker.ietf.org/doc/draft-cridland-xmpp-session/?include_text=1: The Extensible Messaging and Presence Protocol (XMPP) historically had a Session Establishment request, defined in RFC 3921, which clients were required to perform at the beginning of a session. RFC 6121 dropped this entirely. This specification reinstates it as an optional no-op to aid backwards compatibility, matching commonly deployed workarounds.

This is also mentioned at https://tools.ietf.org/html/rfc6121#page-112
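To make the saving concrete, here is a small illustrative sketch (not Tigase code) that counts the round trips in a simplified login sequence with and without the optional session IQ. The step list is an assumption for illustration; real logins may add TLS and extra SASL round trips.

```python
# Simplified XMPP login steps; the session IQ is the only optional one
# under RFC 6121 (it is a no-op kept for backwards compatibility).
LOGIN_STEPS = [
    ("stream open + features", True),     # always required
    ("SASL auth exchange", True),         # always required (collapsed to 1 trip)
    ("stream restart + features", True),  # always required after SASL
    ("resource bind IQ", True),           # RFC 6120, always required
    ("session IQ", False),                # optional no-op under RFC 6121
]

def round_trips(skip_session: bool) -> int:
    """Count client/server round trips; only the session IQ is skippable."""
    return sum(1 for name, required in LOGIN_STEPS
               if required or not skip_session)

print(round_trips(skip_session=False))  # 5
print(round_trips(skip_session=True))   # 4 -- one round trip saved
```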

Can we skip this step with Tigase?


API and development: RE: Connection time optimizations

Wed, 01/27/2016 - 14:32

Igor Khomenko wrote:

Hi again there,

I'm back to this question now; we are still not satisfied with the standard XMPP login logic/duration.

Did you try our suggestion above to use stream management? With long stream management timeouts on the server side, this would give you the fastest possible reconnection time. I do not know the WhatsApp or Viber implementation details, but I believe they must use something similar to ensure both fast and secure reconnection.

I just did a quick research and found some interesting links:

This looks promising, but I do not think the XEP is well thought through. The idea is good, and with some modifications and polishing it should improve login time. I am not sure, however, how significant the improvement would be.

I think you are mistaken with this one; the information on the page is a bit misleading. The page describes only the user-authentication part of the login, not the entire login handshake. I really do not see anything "quick" there. These 2 round trips for authentication are actually a typical exchange for SASL or non-SASL authentication. Before and after authentication there is more, time-consuming handshaking necessary. The XEP above attempts to address this, I mean the whole login process. But really, nothing can be as quick as stream management reconnection because it bypasses all the login handshaking.

There is definitely room for improvement. All modern chat apps, like WhatsApp and Viber, connect really fast.

Comments given above.

Do you have any visibility on this? Will it be possible for the Tigase community to minimise all the needed roundtrips via code customizations, or is it too hard to do from an architectural point of view?

After 3 years, adoption of the mentioned XEP is minimal, and I think the reason is the rather insignificant improvement it provides. I agree there is room for improvement and the login should/could be much faster, but I think the right direction is to work on stream management improvements (spec and implementation) to make it more useful and provide a way for mobile devices to reconnect quickly.
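The stream-management reconnection described above rests on simple bookkeeping: both sides count handled stanzas and queue unacknowledged outbound ones, so a reconnecting client can send a single resume request instead of redoing the whole login. A hedged sketch of that bookkeeping (not Tigase's implementation; for simplicity the ack here is a delta, whereas real XEP-0198 uses a running total):

```python
from collections import deque

class StreamState:
    """Minimal XEP-0198-style state kept across a connection drop."""

    def __init__(self, stream_id: str):
        self.stream_id = stream_id
        self.handled_in = 0          # 'h' counter: inbound stanzas we processed
        self.unacked_out = deque()   # outbound stanzas the peer hasn't acked yet

    def receive(self, stanza: str) -> None:
        self.handled_in += 1

    def send(self, stanza: str) -> None:
        self.unacked_out.append(stanza)

    def peer_acked(self, count: int) -> None:
        # Simplified: peer confirmed `count` more of our stanzas (a delta).
        while self.unacked_out and count > 0:
            self.unacked_out.popleft()
            count -= 1

    def resume_request(self) -> str:
        # One stanza replaces the whole login handshake on reconnect.
        return (f'<resume xmlns="urn:xmpp:sm:3" '
                f'previd="{self.stream_id}" h="{self.handled_in}"/>')

state = StreamState("abc123")
state.receive("<message/>")          # inbound stanza handled -> h becomes 1
state.send("<message to='a@b'/>")
state.send("<message to='c@d'/>")
state.peer_acked(1)                  # peer confirmed 1 of our 2 stanzas
print(state.resume_request())
print(len(state.unacked_out))        # 1 stanza left to replay after resumption
```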


API and development: RE: Connection time optimizations

Wed, 01/27/2016 - 13:20

Hi again there,

I'm back to this question now; we are still not satisfied with the standard XMPP login logic/duration.

I just did a quick research and found some interesting links:

There is definitely room for improvement. All modern chat apps, like WhatsApp and Viber, connect really fast.

Do you have any visibility on this? Will it be possible for the Tigase community to minimise all the needed roundtrips via code customizations, or is it too hard to do from an architectural point of view?


Installation and maintenance: RE: Too many close_wait connections

Wed, 01/27/2016 - 05:20

Hi,
my config
config-type = --gen-config-def
--admins = 1-1@chat.test
# setup DBs
--user-db = mysql
--user-db-uri = --------------------
--data-repo-pool-size = 30
# set domain
--virt-hosts = chat.test
--ssl-def-cert-domain = chat.test
--sm-plugins=-jabber:iq:private,-jabber:iq:register,-jabber:iq:auth,-jabber:iq:version,-msgoffline,-message,-vcard-temp,+message-carbons,+messagetocustomobject,+lastrequestat
--comp-name-1 = ext
--comp-class-1 = tigase.server.ext.ComponentProtocol
--external = muc.chat.test:muc-secret:listen:5271:10.0.10.20:ReceiverBareJidLB
# new config for HTTP component
--comp-name-3=http
--comp-class-3=tigase.http.HttpMessageReceiver
http/http/server-class=tigase.http.jetty.JettyStandaloneHttpServer
http/rest/api-keys[s]=----------------------
http/http/port[I]=8083
# enable WebSockets
ws2s/connections/ports[i]=5290,5291
ws2s/connections/5291/socket=ssl
ws2s/connections/5291/type=accept
bosh/connections/ports[i]=5280,5281
bosh/connections/5281/socket=ssl
bosh/connections/5281/type=accept
# enable XEP-0198
c2s/processors[s]=urn:xmpp:sm:3
c2s/processors/urn\:xmpp\:sm\:3/resumption-timeout[I]=60
c2s/processors/urn\:xmpp\:sm\:3/ack-request-count[I]=1
# enable monitoring
--monitoring = http:9080
--cm-traffic-throttling = xmpp:600:0:disc,bin:0:0:disc
--cm-ht-traffic-throttling = xmpp:0:0:disc,bin:0:0:disc

There are no errors or warnings in logs except c2s connection


Installation and maintenance: RE: Too many close_wait connections

Tue, 01/26/2016 - 06:29

What exact requests are you making? What is your (Tigase) configuration? (I assume it's not default, at least judging by the ports.) You are referring to the HTTP component, yet you provide an excerpt from the logs which is related to c2s connections (regular XMPP client socket connections over port 5222).

Bottom line: please provide more details and comments.


Installation and maintenance: RE: Users not seems in database when create user using Spark Client...

Tue, 01/26/2016 - 06:17

Thank you for attaching the logs. It looks like Tigase started correctly, so there shouldn't be any issue, and the configuration is also correct.

Please also include the complete logs (i.e. the logs/tigase.log.* files; please compress them first) for a clean case (i.e. shut down Tigase, clear the logs/ directory, start Tigase, reproduce the issue and then grab the logs). Including complete client logs will also be helpful.

Naitik Vithalani wrote:

Can you please review the log and guide us at your earliest convenience?

This is a community support board and while we try our best to respond as soon as possible to all messages it's not always possible. You may be interested in our support options - please contact us using this form: http://tigase.net/contact


API and development: RE: Password verification problem

Tue, 01/26/2016 - 06:01

The error states:

Caused by: tigase.db.UserNotFoundException: User does not exist: tsunga23319@tig.g.com, in database: jdbc:mysql://localhost:3306/tigasedb?user=root&password=test&useUnicode=true&characterEncoding=UTF-8&autoCreateUser=true

Please make sure that user tsunga23319@tig.g.com exists in database: tigasedb
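One way to check is to query the user table directly. The table and column names below (tig_users / user_id) follow the standard Tigase MySQL schema, but verify them against your installation; sqlite3 stands in for MySQL here so the sketch is self-contained and runnable.

```python
import sqlite3

# In-memory stand-in for the tigasedb database (assumption: same shape as
# Tigase's tig_users table, which stores the bare JID in user_id).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tig_users (user_id TEXT PRIMARY KEY)")
conn.execute("INSERT INTO tig_users (user_id) VALUES ('admin@tig.g.com')")

def user_exists(conn: sqlite3.Connection, bare_jid: str) -> bool:
    """Return True if the bare JID has a row in tig_users."""
    row = conn.execute(
        "SELECT 1 FROM tig_users WHERE user_id = ?", (bare_jid,)
    ).fetchone()
    return row is not None

# A missing row is exactly what triggers UserNotFoundException in Tigase.
print(user_exists(conn, "tsunga23319@tig.g.com"))  # False
print(user_exists(conn, "admin@tig.g.com"))        # True
```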


Installation and maintenance: Too many close_wait connections

Tue, 01/26/2016 - 02:17

Hi,
We configured the HTTP component and have many connections to Tigase, but when our application closes a connection, Tigase doesn't.
Please help me understand what the problem is.

lsof -i -n -P |grep 8083

java 19147 chat 3724u IPv6 177089053 0t0 TCP 172.31.19.232:8083->172.31.18.195:32833 (CLOSE_WAIT)
java 19147 chat 3725u IPv6 177089364 0t0 TCP 172.31.19.232:8083->172.31.18.195:32994 (CLOSE_WAIT)
java 19147 chat 3726u IPv6 177089070 0t0 TCP 172.31.19.232:8083->172.31.18.195:32844 (CLOSE_WAIT)
java 19147 chat 3727u IPv6 177089166 0t0 TCP 172.31.19.232:8083->172.31.18.195:32922 (CLOSE_WAIT)
java 19147 chat 3728u IPv6 177090536 0t0 TCP 172.31.19.232:8083->172.31.18.195:33136 (CLOSE_WAIT)
java 19147 chat 3729u IPv6 177091439 0t0 TCP 172.31.19.232:8083->172.31.18.195:33700 (CLOSE_WAIT)
java 19147 chat 3730u IPv6 177094199 0t0 TCP 172.31.19.232:8083->172.31.18.195:34070 (CLOSE_WAIT)
java 19147 chat 3731u IPv6 177098787 0t0 TCP 172.31.19.232:8083->172.31.18.195:35102 (CLOSE_WAIT)
java 19147 chat 3732u IPv6 177089405 0t0 TCP 172.31.19.232:8083->172.31.18.195:33067 (CLOSE_WAIT)
java 19147 chat 3734u IPv6 177090440 0t0 TCP 172.31.19.232:8083->172.31.18.195:33134 (CLOSE_WAIT)
java 19147 chat 3735u IPv6 177091750 0t0 TCP 172.31.19.232:8083->172.31.18.195:33278 (CLOSE_WAIT)
java 19147 chat 3736u IPv6 177091710 0t0 TCP 172.31.19.232:8083->172.31.18.195:33264 (CLOSE_WAIT)
java 19147 chat 3737u IPv6 177090648 0t0 TCP 172.31.19.232:8083->172.31.18.195:33212 (CLOSE_WAIT)
java 19147 chat 3738u IPv6 177091709 0t0 TCP 172.31.19.232:8083->172.31.18.195:33213 (CLOSE_WAIT)
java 19147 chat 3740u IPv6 177090848 0t0 TCP 172.31.19.232:8083->172.31.18.195:33281 (CLOSE_WAIT)
java 19147 chat 3741u IPv6 177090850 0t0 TCP 172.31.19.232:8083->172.31.18.195:33326 (CLOSE_WAIT)
java 19147 chat 3743u IPv6 177092853 0t0 TCP 172.31.19.232:8083->172.31.18.195:33754 (CLOSE_WAIT)
java 19147 chat 3745u IPv6 177097352 0t0 TCP 172.31.19.232:8083->172.31.18.195:34934 (CLOSE_WAIT)
java 19147 chat 3746u IPv6 177098831 0t0 TCP 172.31.19.232:8083->172.31.18.195:35153 (CLOSE_WAIT)
java 19147 chat 3747u IPv6 177093859 0t0 TCP 172.31.19.232:8083->172.31.18.195:33902 (CLOSE_WAIT)
java 19147 chat 3748u IPv6 177093108 0t0 TCP 172.31.19.232:8083->172.31.18.195:33835 (CLOSE_WAIT)
java 19147 chat 3751u IPv6 177094314 0t0 TCP 172.31.19.232:8083->172.31.18.195:34125 (CLOSE_WAIT)
java 19147 chat 3752u IPv6 177094318 0t0 TCP 172.31.19.232:8083->172.31.18.195:34202 (CLOSE_WAIT)
java 19147 chat 3755u IPv6 177095475 0t0 TCP 172.31.19.232:8083->172.31.18.195:34472 (CLOSE_WAIT)
java 19147 chat 3757u IPv6 177101012 0t0 TCP 172.31.19.232:8083->172.31.18.195:35575 (CLOSE_WAIT)
java 19147 chat 3758u IPv6 177097987 0t0 TCP 172.31.19.232:8083->172.31.18.195:35176 (CLOSE_WAIT)
java 19147 chat 3762u IPv6 177101164 0t0 TCP 172.31.19.232:8083->172.31.18.195:35757 (CLOSE_WAIT)
java 19147 chat 3763u IPv6 177101163 0t0 TCP 172.31.19.232:8083->172.31.18.195:35701 (CLOSE_WAIT)

lsof -i -n -P |grep 8083 | wc -l
3519

cat /etc/sysctl.conf
fs.file-max=360000
net.ipv4.ip_local_port_range=1024 65000
net.ipv4.tcp_keepalive_time=60
net.ipv4.tcp_keepalive_probes=3
net.ipv4.tcp_keepalive_intvl=90

cat /etc/security/limits.conf
chat soft nofile 500000
chat hard nofile 500000

cat tigase-console.log
2016-01-21 15:40:13.459 [in_1-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-21 18:34:36.625 [in_7-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 08:26:49.028 [in_3-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 10:11:23.588 [in_7-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 12:06:28.062 [in_0-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 13:38:36.412 [in_5-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 14:16:34.856 [in_7-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 14:25:20.469 [in_6-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 16:15:10.247 [in_7-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 17:17:51.320 [in_5-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-22 18:40:20.984 [in_5-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
2016-01-23 19:16:18.819 [in_0-c2s] TLSIO.writeBuff() WARNING: Infinite loop detected in writeBuff(buff) TLS code, tlsWrapper.getStatus(): NEED_READ
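For context on the symptom above: CLOSE_WAIT means the peer has sent FIN but the local application has never called close() on its end of the socket, so the descriptor lingers. A minimal, Tigase-independent demonstration (socketpair stands in for a TCP connection; the EOF semantics are the same):

```python
import socket

# A connected pair of sockets, standing in for a client/server TCP connection.
server_side, client_side = socket.socketpair()

client_side.close()            # the "application" closes its connection
data = server_side.recv(1024)  # EOF: an empty read means the peer closed
assert data == b""

# Correct handling: on empty read, close our end too; skipping this close()
# is exactly what leaves real TCP sockets stuck in CLOSE_WAIT.
server_side.close()
print("closed cleanly")
```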


Installation and maintenance: RE: Users not seems in database when create user using Spark Client...

Tue, 01/26/2016 - 00:29

Hello,

Can you please review the log and guide us at your earliest convenience?

Thanks,


Installation and maintenance: RE: Users not seems in database when create user using Spark Client...

Mon, 01/25/2016 - 05:46

I have attached the complete log report (.log file) here.


API and development: RE: Password verification problem

Mon, 01/25/2016 - 05:33

Hi,
I have the same problem. I use MySQL 5.1. Here are my full logs:
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=64m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.xml.XMLUtils}
componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.util.ClassUtil}
componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.server.XMPPServer}
2016-01-25 21:13:18.701 [main] DNSResolver.<clinit>() WARNING: Resolving default host name: server14 took: 5
2016-01-25 21:13:18.728 [main] ConfiguratorAbstract.parseArgs() CONFIG: Setting defaults: --property-file = etc/init.properties
2016-01-25 21:13:18.729 [main] ConfiguratorAbstract.parseArgs() CONFIG: Setting defaults: --test = true
2016-01-25 21:13:18.729 [main] ConfiguratorAbstract.parseArgs() CONFIG: Loading initial properties from property file: etc/init.prop
erties
2016-01-25 21:13:18.730 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--comp-class-1=tigase.archi
ve.MessageArchiveComponent)
2016-01-25 21:13:18.730 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--data-repo-pool-size=15)
2016-01-25 21:13:18.730 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--virt-hosts=tig.g.com)
2016-01-25 21:13:18.730 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--auth-db-uri=jdbc:mysql://
localhost:3306/tigasedb?user=root&password=test&useUnicode=true&characterEncoding=UTF-8&autoCreateUser=true)
2016-01-25 21:13:18.730 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--net-buff-high-throughput=
256k)
2016-01-25 21:13:18.730 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--user-db-uri=jdbc:mysql://
192.168.55.239:3306/tigasedb?user=root&password=test&useUnicode=true&characterEncoding=UTF-8&autoCreateUser=true)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--cluster-connect-all=true)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--user-db=mysql)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--admins=)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--cluster-nodes=server07:xm
wang:5277,server13:xmwang:5277,server14:xmwang:5277,server02:xmwang:5277)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--auth-db=tigase-custom)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (config-type=--gen-config-de
f)
2016-01-25 21:13:18.731 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--comp-name-1=message-archi
ve)
2016-01-25 21:13:18.732 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--cluster-mode=true)
2016-01-25 21:13:18.732 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--sm-plugins=+message-archi
ve-xep-0136)
2016-01-25 21:13:18.732 [main] ConfiguratorAbstract.parseArgs() CONFIG: Added default config parameter: (--monitoring=jmx:9050)
2016-01-25 21:13:18.778 [main] MonitoringSetup.initMonitoring() CONFIG: Installing monitoring services: jmx:9050
2016-01-25 21:13:18.779 [main] MonitoringSetup.initMonitoring() CONFIG: Loading JMX monitor.
2016-01-25 21:13:18.908 [main] AbstractMessageReceiver.setMaxQueueSize() FINEST: message-router maxQueueSize: 10,650, maxInQueueSize: 1
,330, maxOutQueueSize: 21,300
2016-01-25 21:13:18.910 [main] MessageRouter.addRegistrator() INFO: Adding registrator: Configurator
2016-01-25 21:13:18.910 [main] MessageRouter.addComponent() INFO: Adding component:
2016-01-25 21:13:18.911 [main] ConfiguratorAbstract.componentAdded() CONFIG: component: basic-conf
2016-01-25 21:13:18.911 [main] ConfiguratorAbstract.setup() CONFIG: Setting up component: basic-conf
2016-01-25 21:13:18.913 [main] ConfiguratorAbstract.setup() CONFIG: Component basic-conf defaults: {component-id=basic-conf@serv
er14, def-hostname=server14, admins=[Ljava.lang.String;@14899482, scripts-dir=scripts/admin, command/ALL=ADMIN, logging/.level=WARNING, logging/hand
lers=java.util.logging.ConsoleHandler java.util.logging.FileHandler, logging/java.util.logging.ConsoleHandler.formatter=tigase.util.LogFormatter, lo
gging/java.util.logging.ConsoleHandler.level=WARNING, logging/java.util.logging.FileHandler.append=true, logging/java.util.logging.FileHandler.count
=5, logging/java.util.logging.FileHandler.formatter=tigase.util.LogFormatter, logging/java.util.logging.FileHandler.limit=10000000, logging/java.uti
l.logging.FileHandler.pattern=logs/tigase.log, logging/tigase.useParentHandlers=true, logging/java.util.logging.FileHandler.level=ALL, user-domain-r
epo-pool=tigase.db.UserRepositoryMDImpl, auth-domain-repo-pool=tigase.db.AuthRepositoryMDImpl, user-repo-pool-size=10, data-repo-pool-size=15, auth-
repo-pool-size=15, user-repo-class=mysql, user-repo-url=jdbc:mysql://192.168.55.239:3306/tigasedb?user=root&password=test&useUnicode=true&characterE
ncoding=UTF-8&autoCreateUser=true, auth-repo-class=tigase-custom, auth-repo-url=jdbc:mysql://localhost:3306/tigasedb?user=root&password=test&useUnic
ode=true&characterEncoding=UTF-8&autoCreateUser=true, ssl-container-class=tigase.io.SSLContextContainer, ssl-certs-location=certs/, ssl-def-cert-dom
ain=default, config-dump-file=etc/config-dump.properties}
2016-01-25 21:13:18.913 [main] ConfiguratorAbstract.setup() CONFIG: Component basic-conf configuration: {component-id=basic-conf
@server14, def-hostname=server14, admins=[Ljava.lang.String;@14899482, scripts-dir=scripts/admin, command/ALL=ADMIN, logging/.level=WARNING, logging
/handlers=java.util.logging.ConsoleHandler java.util.logging.FileHandler, logging/java.util.logging.ConsoleHandler.formatter=tigase.util.LogFormatte
r, logging/java.util.logging.ConsoleHandler.level=WARNING, logging/java.util.logging.FileHandler.append=true, logging/java.util.logging.FileHandler.
count=5, logging/java.util.logging.FileHandler.formatter=tigase.util.LogFormatter, logging/java.util.logging.FileHandler.limit=10000000, logging/jav
a.util.logging.FileHandler.pattern=logs/tigase.log, logging/tigase.useParentHandlers=true, logging/java.util.logging.FileHandler.level=ALL, user-dom
ain-repo-pool=tigase.db.UserRepositoryMDImpl, auth-domain-repo-pool=tigase.db.AuthRepositoryMDImpl, user-repo-pool-size=10, data-repo-pool-size=15,
auth-repo-pool-size=15, user-repo-class=mysql, user-repo-url=jdbc:mysql://192.168.55.239:3306/tigasedb?user=root&password=test&useUnicode=true&chara
cterEncoding=UTF-8&autoCreateUser=true, auth-repo-class=tigase-custom, auth-repo-url=jdbc:mysql://localhost:3306/tigasedb?user=root&password=test&us
eUnicode=true&characterEncoding=UTF-8&autoCreateUser=true, ssl-container-class=tigase.io.SSLContextContainer, ssl-certs-location=certs/, ssl-def-cer
t-domain=default, config-dump-file=etc/config-dump.properties}
2016-01-25 21:13:18.919 [main] ConfiguratorAbstract.setProperties() INFO: Propeties size is 33, and here are all propeties: {component
id=basic-conf@server14, def-hostname=server14, admins=[Ljava.lang.String;@14899482, scripts-dir=scripts/admin, command/ALL=ADMIN, logging/.level=WA
RNING, logging/handlers=java.util.logging.ConsoleHandler java.util.logging.FileHandler, logging/java.util.logging.ConsoleHandler.formatter=tigase.ut
il.LogFormatter, logging/java.util.logging.ConsoleHandler.level=WARNING, logging/java.util.logging.FileHandler.append=true, logging/java.util.loggin
g.FileHandler.count=5, logging/java.util.logging.FileHandler.formatter=tigase.util.LogFormatter, logging/java.util.logging.FileHandler.limit=1000000
0, logging/java.util.logging.FileHandler.pattern=logs/tigase.log, logging/tigase.useParentHandlers=true, logging/java.util.logging.FileHandler.level
=ALL, user-domain-repo-pool=tigase.db.UserRepositoryMDImpl, auth-domain-repo-pool=tigase.db.AuthRepositoryMDImpl, user-repo-pool-size=10, data-repo
pool-size=15, auth-repo-pool-size=15, user-repo-class=mysql, user-repo-url=jdbc:mysql://192.168.55.239:3306/tigasedb?user=root&password=test&useUnic
ode=true&characterEncoding=UTF-8&autoCreateUser=true, auth-repo-class=tigase-custom, auth-repo-url=jdbc:mysql://localhost:3306/tigasedb?user=root&pa
ssword=test&useUnicode=true&characterEncoding=UTF-8&autoCreateUser=true, ssl-container-class=tigase.io.SSLContextContainer, ssl-certs-location=certs
/, ssl-def-cert-domain=default, config-dump-file=etc/config-dump.properties, shared-user-repo=null, shared-user-repo-params=null, shared-auth-repo=n
ull, shared-auth-repo-params=null}
2016-01-25 21:13:21.424 [main] SimpleCache.<init>() WARNING: Tigase cache turned off.
2016-01-25 21:13:22.122 [main] VHostManager.setProperties() WARNING: {tig.g.com=Domain: tig.g.com, enabled: true, anonym: true, r
egister: true, maxusers: 0, tls: false, s2sSecret: 5214390c-c12c-426b-a940-e3bd7d480b54, domainFilter: ALL, domainFilterDomains: null, c2sPortsAllow
ed: null, saslAllowedMechanisms: null}
2016-01-25 21:13:22.287 [main] SimpleCache.<init>() WARNING: Tigase cache turned off.
Loading component: amp :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.AmpCompon
entClustered}
Loading component: bosh :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.BoshConn
ectionClustered}
Loading component: c2s :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.ClientCon
nectionClustered}
2016-01-25 21:13:22.563 [main] ClusterConnectionManager.itemAdded() WARNING: Incorrect ClusterRepoItem, skipping connection attempt: se
rver14:xmwang:5277:0:0.0:0.0
Loading component: cl-comp :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.Clust
erConnectionManager}
Loading component: message-archive :: componentInfo{Title=Tigase Message Archiving Component, Version=1.1.0-b71/c4003eb3, Class=tigase.archive.Messa
geArchiveComponent}
Loading component: monitor :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.Monit
orClustered}
Loading component: s2s :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.S2SConnec
tionClustered}
Loading plugin: session-close=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: session-open=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: default-handler=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: jabber:iq:register=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: jabber:iq:auth=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: urn:ietf:params:xml:ns:xmpp-sasl=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: urn:ietf:params:xml:ns:xmpp-bind=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: urn:ietf:params:xml:ns:xmpp-session=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: jabber:iq:roster=8:13312 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: jabber:iq:privacy=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: jabber:iq:version=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: http://jabber.org/protocol/stats=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: starttls=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: vcard-temp=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: http://jabber.org/protocol/commands=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: jabber:iq:private=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: urn:xmpp:ping=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: presence=8:13312 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: disco=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: zlib=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: amp=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: message-carbons=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
Loading plugin: message-archive-xep-0136=4:26625 ... , version: 7.0.2-b3821/563fcf81 (2015-05-15/00:41:16)
MA LOADED = message-archive@server14
Loading component: sess-man :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.Sess
ionManagerClustered, componentData={ClusteringStrategy=class tigase.cluster.strategy.DefaultClusteringStrategy}}
Loading component: ws2s :: componentInfo{Title=Tigase XMPP Server, Version=7.0.2-b3821/563fcf81 (2015-05-15/00:41:16), Class=tigase.cluster.WebSocke
tClientConnectionClustered}
2016-01-25 21:13:24.513 [main] ConfigurationCache.store() WARNING: Dumping server configuration to: etc/config-dump.properties
2016-01-25 21:13:29.881 [ConnectionOpenThread] SocketThread.<clinit>() WARNING: 17 socketReadThreads started.
2016-01-25 21:13:29.887 [ConnectionOpenThread] SocketThread.<clinit>() WARNING: 17 socketWriteThreads started.
2016-01-25 21:13:31.956 [in_13-cl-comp] ClusterConnectionManager.writePacketToSocket() WARNING: No cluster connection to send a packet: from=nul
l, to=null, DATA=<cluster xmlns="tigase:cluster" type="set" id="cl-3" to="sess-man@server02" from="sess-man@server02"><control><visited-nodes><node-
id>sess-man@server02</node-id><node-id>sess-man@server13</node-id><node-id>sess-man@server07</node-id><node-id>sess-man@server14</node-id></visited-
nodes><method-call name="sess-man-packet-forward-sm-cmd"/><first-node>sess-man@server02</first-node></control><data><message type="chat" to="admin@t
ig.g.com" xmlns="jabber:client" id="sess-man2" from=""><body>Cluster node server07 connected to server02 (Mon Jan 25 21:13:31 CST
2016)</body><thread>cluster_status_update</thread></message></data></cluster>, SIZE=652, XMLNS=tigase:cluster, PRIORITY=CLUSTER, PERMISSION=NONE, TY
PE=set
2016-01-25 21:19:47.478 [jabber:iq:auth Queue Worker 0] JabberIqAuth.doAuth() WARNING: Can't authenticate with given CallbackHandler
java.io.IOException: Password verification problem.
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleVerifyPasswordCallback(AuthRepoPlainCallbackHandler.java:141)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleCallback(AuthRepoPlainCallbackHandler.java:94)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handle(AuthRepoPlainCallbackHandler.java:66)
at tigase.xmpp.impl.JabberIqAuth.doAuth(JabberIqAuth.java:319)
at tigase.xmpp.impl.JabberIqAuth.process(JabberIqAuth.java:237)
at tigase.server.xmppsession.SessionManager$ProcessorWorkerThread.process(SessionManager.java:2440)
at tigase.util.WorkerThread.run(WorkerThread.java:128)
Caused by: tigase.db.UserNotFoundException: User does not exist: , in database: jdbc:mysql://localhost:3306/tigasedb?user=root&
password=test&useUnicode=true&characterEncoding=UTF-8&autoCreateUser=true
at tigase.db.jdbc.TigaseCustomAuth.userLoginAuth(TigaseCustomAuth.java:887)
at tigase.db.jdbc.TigaseCustomAuth.plainAuth(TigaseCustomAuth.java:618)
at tigase.db.jdbc.TigaseCustomAuth.otherAuth(TigaseCustomAuth.java:601)
at tigase.db.AuthRepositoryMDImpl.otherAuth(AuthRepositoryMDImpl.java:173)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleVerifyPasswordCallback(AuthRepoPlainCallbackHandler.java:134)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleCallback(AuthRepoPlainCallbackHandler.java:94)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handle(AuthRepoPlainCallbackHandler.java:66)
at tigase.xmpp.impl.JabberIqAuth.doAuth(JabberIqAuth.java:319)
at tigase.xmpp.impl.JabberIqAuth.process(JabberIqAuth.java:237)
at tigase.server.xmppsession.SessionManager$ProcessorWorkerThread.process(SessionManager.java:2440)
at tigase.util.WorkerThread.run(WorkerThread.java:128)
2016-01-25 21:21:55.685 [jabber:iq:auth Queue Worker 2] JabberIqAuth.doAuth() WARNING: Can't authenticate with given CallbackHandler
java.io.IOException: Password verification problem.
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleVerifyPasswordCallback(AuthRepoPlainCallbackHandler.java:141)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleCallback(AuthRepoPlainCallbackHandler.java:94)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handle(AuthRepoPlainCallbackHandler.java:66)
at tigase.xmpp.impl.JabberIqAuth.doAuth(JabberIqAuth.java:319)
at tigase.xmpp.impl.JabberIqAuth.process(JabberIqAuth.java:237)
at tigase.server.xmppsession.SessionManager$ProcessorWorkerThread.process(SessionManager.java:2440)
at tigase.util.WorkerThread.run(WorkerThread.java:128)
Caused by: tigase.db.UserNotFoundException: User does not exist: , in database: jdbc:mysql://localhost:3306/tigasedb?user=root&
password=test&useUnicode=true&characterEncoding=UTF-8&autoCreateUser=true
at tigase.db.jdbc.TigaseCustomAuth.userLoginAuth(TigaseCustomAuth.java:887)
at tigase.db.jdbc.TigaseCustomAuth.plainAuth(TigaseCustomAuth.java:618)
at tigase.db.jdbc.TigaseCustomAuth.otherAuth(TigaseCustomAuth.java:601)
at tigase.db.AuthRepositoryMDImpl.otherAuth(AuthRepositoryMDImpl.java:173)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleVerifyPasswordCallback(AuthRepoPlainCallbackHandler.java:134)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handleCallback(AuthRepoPlainCallbackHandler.java:94)
at tigase.auth.impl.AuthRepoPlainCallbackHandler.handle(AuthRepoPlainCallbackHandler.java:66)
at tigase.xmpp.impl.JabberIqAuth.doAuth(JabberIqAuth.java:319)
at tigase.xmpp.impl.JabberIqAuth.process(JabberIqAuth.java:237)
at tigase.server.xmppsession.SessionManager$ProcessorWorkerThread.process(SessionManager.java:2440)
at tigase.util.WorkerThread.run(WorkerThread.java:128)

Waiting for your help, thanks.


Installation and maintenance: RE: Unable to login Web Client

Thu, 01/21/2016 - 05:42

A fixed build of the web client is part of the Tigase XMPP Server 7.1.0-SNAPSHOT builds, so you can use it by updating your installation to a newer 7.1.0-SNAPSHOT build.


Installation and maintenance: RE: Unable to login Web Client

Thu, 01/21/2016 - 05:39

Hi, thanks for your reply. Where can I find the fix for the error in handling see-other-host in the web client code?


Installation and maintenance: RE: Users not seems in database when create user using Spark Client...

Thu, 01/21/2016 - 03:19

Sorry, but I found the tigase-console.log file at logs/tigase-console.log.

I have attached the log report (.log file) here.

Following are the settings I enabled during server installation (Advanced config):

1. Separate Auth DB: off
2. MUC: on
3. Pubsub: on

The rest are off.


Installation and maintenance: RE: Users not seems in database when create user using Spark Client...

Thu, 01/21/2016 - 03:10

OK, what changes are you making exactly? Only enabling PubSub?

Can you share etc/tigase-console.log after the startup?

