Locally created st2 packs failing on StackStorm HA in Kubernetes


The pack works fine on a standalone server, but fails inside the Docker image used by StackStorm HA in Kubernetes. I created the st2 pack image with docker build, pushed it, and pulled it through Helm, but I am hitting this error:
pymongo.errors.ServerSelectionTimeoutError: 127.0.0.1:27017: [Errno 111] ECONNREFUSED

2020-12-04 14:01:32,140 DEBUG [-] Using Python: 3.6.9 (/opt/stackstorm/st2/bin/python)
2020-12-04 14:01:32,141 DEBUG [-] Using config files: /etc/st2/st2.conf
2020-12-04 14:01:32,142 DEBUG [-] Using logging config: /etc/st2/logging.sensorcontainer.conf
2020-12-04 14:01:32,151 INFO [-] Connecting to database "st2" @ "127.0.0.1:27017" as user "None".
2020-12-04 14:01:35,158 ERROR [-] Failed to connect to database "st2" @ "127.0.0.1:27017" as user "None": 127.0.0.1:27017: [Errno 111] ECONNREFUSED
2020-12-04 14:01:35,158 ERROR [-] (PID:191) SensorContainer quit due to exception.
Traceback (most recent call last):
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/cmd/sensormanager.py", line 61, in main
_setup()
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/cmd/sensormanager.py", line 52, in _setup
register_runners=False, service_registry=True, capabilities=capabilities)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/service_setup.py", line 162, in setup
db_setup()
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/database_setup.py", line 57, in db_setup
connection = db_init.db_setup_with_retry(**db_cfg)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/persistence/db_init.py", line 76, in db_setup_with_retry
ssl_match_hostname=ssl_match_hostname)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/persistence/db_init.py", line 59, in db_func_with_retry
return retrying_obj.call(db_func, *args, **kwargs)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/retrying.py", line 206, in call
return attempt.get(self._wrap_exception)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/retrying.py", line 247, in get
six.reraise(self.value[0], self.value[1], self.value[2])
File "/opt/stackstorm/st2/lib/python3.6/site-packages/six.py", line 696, in reraise
raise value
File "/opt/stackstorm/st2/lib/python3.6/site-packages/retrying.py", line 200, in call
attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/models/db/__init__.py", line 170, in db_setup
ssl_match_hostname=ssl_match_hostname)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/models/db/__init__.py", line 152, in _db_connect
raise e
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/models/db/__init__.py", line 145, in _db_connect
connection.admin.command('ismaster')
File "/opt/stackstorm/st2/lib/python3.6/site-packages/pymongo/database.py", line 730, in command
read_preference, session) as (sock_info, slave_ok):
File "/usr/lib/python3.6/contextlib.py", line 81, in __enter__
return next(self.gen)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/pymongo/mongo_client.py", line 1298, in _socket_for_reads
server = self._select_server(read_preference, session)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/pymongo/mongo_client.py", line 1253, in _select_server
server = topology.select_server(server_selector)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/pymongo/topology.py", line 235, in select_server
address))
File "/opt/stackstorm/st2/lib/python3.6/site-packages/pymongo/topology.py", line 193, in select_servers
selector, server_timeout, address)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/pymongo/topology.py", line 209, in _select_servers_loop
self._error_message(selector))
pymongo.errors.ServerSelectionTimeoutError: 127.0.0.1:27017: [Errno 111] ECONNREFUSED

You're doing something wrong: your environment uses 127.0.0.1:27017 (localhost) for the MongoDB connection instead of the MongoDB-HA cluster.

2020-12-04 14:01:32,141 DEBUG [-] Using config files: /etc/st2/st2.conf

shows that it uses only st2.conf, while stackstorm-ha chains several st2 configuration files optimized for the K8s environment.
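As a minimal illustration of why the chained files matter (using plain stdlib configparser, not st2's actual oslo.config machinery, and simplified stand-in file contents): values from files read later override earlier ones, which is how st2.docker.conf replaces the default 127.0.0.1 database host with the MongoDB-HA service name.

```python
import configparser
import os
import tempfile

d = tempfile.mkdtemp()
base = os.path.join(d, "st2.conf")
override = os.path.join(d, "st2.docker.conf")

# Base config points at localhost, like a standalone install.
with open(base, "w") as f:
    f.write("[database]\nhost = 127.0.0.1\nport = 27017\n")

# The K8s overlay only overrides the host.
with open(override, "w") as f:
    f.write("[database]\nhost = poc-stackstorm-mongodb-ha-0.poc-stackstorm-mongodb-ha\n")

cfg = configparser.ConfigParser()
cfg.read([base, override])      # files are read in order; the last file wins
print(cfg["database"]["host"])  # the HA hostname, not 127.0.0.1
print(cfg["database"]["port"])  # 27017, kept from the base file
```

Running the sensor container with only --config-file=/etc/st2/st2.conf skips the overlay files, so the localhost default survives and the connection is refused.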


Which Pod is that? Is it only a Sensor that’s failing or any pack execution? How do you run Sensors, which Helm settings did you configure, were there any modifications to base Helm chart?

Hi Eugen,

The pack in question is a modified servicenow_pack running in StackStorm HA in Kubernetes.

The application team uses servicenow_pack to send a failure message from the application,
which creates a ServiceNow ticket via StackStorm.

The servicenow_pack works on the standalone system but fails on HA.

1) Which Pod is that?

I am trying to run the command:
/opt/stackstorm/st2/bin/st2sensorcontainer --config-file=/etc/st2/st2.conf --debug --sensor-ref=servicenow_pack.DirectorySensor

2) Is it only a Sensor that's failing or any pack execution?

Both pack execution and the sensor are failing.

3) Which Helm settings did you configure, and were there any modifications to the base Helm chart?

The Helm configuration had modifications to MongoDB and RabbitMQ from the DB team. For troubleshooting I used the default Helm values and modified only the st2 pack contents, and I still hit the same issue.

##########stackstorm HA ############

root@stackstorm-poc-st2client-686d789fbd-tcvr4:/opt/stackstorm/packs/servicenow_pack# /opt/stackstorm/st2/bin/st2sensorcontainer --debug --sensor-ref=servicenow_pack.DirectorySensor
2020-12-07 13:51:00,620 DEBUG [-] Using Python: 3.6.9 (/opt/stackstorm/st2/bin/python)
2020-12-07 13:51:00,621 DEBUG [-] Using config files: /etc/st2/st2.conf
2020-12-07 13:51:00,621 DEBUG [-] Using logging config: /etc/st2/logging.sensorcontainer.conf
2020-12-07 13:51:00,631 INFO [-] Connecting to database "st2" @ "127.0.0.1:27017" as user "None".
2020-12-07 13:51:03,638 ERROR [-] Failed to connect to database "st2" @ "127.0.0.1:27017" as user "None": 127.0.0.1:27017: [Errno 111] ECONNREFUSED
2020-12-07 13:51:03,638 ERROR [-] (PID:1241) SensorContainer quit due to exception.
Traceback (most recent call last):
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/cmd/sensormanager.py", line 61, in main
_setup()
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/cmd/sensormanager.py", line 52, in _setup
register_runners=False, service_registry=True, capabilities=capabilities)

####################

########standalone ##########
/opt/stackstorm/st2/bin/st2sensorcontainer --config-file=/etc/st2/st2.conf --debug --sensor-ref=servicenow_pack.DirectorySensor
2020-12-07 07:02:36,679 DEBUG [-] Using Python: 2.7.5 (/opt/stackstorm/st2/bin/python)
2020-12-07 07:02:36,679 DEBUG [-] Using config files: /etc/st2/st2.conf
2020-12-07 07:02:36,679 DEBUG [-] Using logging config: /etc/st2/logging.sensorcontainer.conf
2020-12-07 07:02:36,689 INFO [-] Connecting to database "st2" @ "127.0.0.1:27017" as user "None".
2020-12-07 07:02:36,693 INFO [-] Successfully connected to database "st2" @ "127.0.0.1:27017" as user "None".
2020-12-07 07:02:36,693 DEBUG [-] Ensuring database indexes...
2020-12-07 07:02:36,806 DEBUG [-] Skipping index cleanup for blacklisted model "PermissionGrantDB"...
2020-12-07 07:02:36,861 DEBUG [-] Indexes are ensured for models: ActionAliasDB, ActionAliasDB, ActionDB, ActionExecutionDB, ActionExecutionDB, ActionExecutionOutputDB, ActionExecutionSchedulingQueueItemDB, ActionExecutionStateDB, ActionExecutionStateDB, ApiKeyDB, ConfigDB, ConfigSchemaDB, GroupToRoleMappingDB, KeyValuePairDB, LiveActionDB, LiveActionDB, PackDB, PermissionGrantDB, PolicyDB, PolicyTypeDB, RoleDB, RuleDB, RuleEnforcementDB, RunnerTypeDB, RunnerTypeDB, SensorTypeDB, TaskExecutionDB, TokenDB, TraceDB, TriggerDB, TriggerInstanceDB, TriggerTypeDB, UserDB, UserRoleAssignmentDB, WorkflowExecutionDB
2020-12-07 07:02:36,861 DEBUG [-] Registering exchanges...
2020-12-07 07:02:36,862 DEBUG [-] Using SSL context for RabbitMQ connection: {}
2020-12-07 07:02:36,871 DEBUG [-] Start from server, version: 0.9, properties: {'information': 'Licensed under the MPL. See http://www.rabbitmq.com/', 'product': 'RabbitMQ', 'copyright': 'Copyright © 2007-2014 GoPivotal, Inc.', 'capabilities': {'exchange_exchange_bindings': True, 'connection.blocked': True, 'authentication_failure_close': True, 'basic.nack': True, 'per_consumer_qos': True, 'consumer_priorities': True, 'consumer_cancel_notify': True, 'publisher_confirms': True}, 'cluster_name': 'rabbit@dc1udatast201.abacus-us.com', 'platform': 'Erlang/OTP', 'version': '3.3.5'}, mechanisms: ['AMQPLAIN', 'PLAIN'], locales: [u'en_US']
2020-12-07 07:02:36,873 DEBUG [-] using channel_id: 1
2020-12-07 07:02:36,873 DEBUG [-] Channel open
2020-12-07 07:02:36,874 DEBUG [-] Registered exchange st2.actionexecutionstate ({'nowait': False, 'exchange': 'st2.actionexecutionstate', 'dur

################

OK, so it looks like you're trying to run the sensor manually from the st2client Pod.

How about the st2sensorcontainer Pod? That is where your sensors should be deployed automatically. Is the Pod running? What do you see in the st2sensorcontainer Pod's logs?

No, I think the sensor gets stopped once I log out of the st2 client.

docker ps -a |grep -i st2sensorcontainer

10b942fdc9d3 ec0a99de65f3 “/opt/stackstorm/st2…” 2 minutes ago Up 2 minutes k8s_st2sensorcontainer_poc-stackstorm-st2sensorcontainer-645fdccb65-hgw7x_stackstorm_7f9b8f70-12f5-4230-a4b4-8b59779a471f_0

3fd0124553e7 41bc819be721 “sh -ec '/bin/cp -aR…” 2 minutes ago Exited (0) 2 minutes ago k8s_st2-system-packs_poc-stackstorm-st2sensorcontainer-645fdccb65-hgw7x_stackstorm_7f9b8f70-12f5-4230-a4b4-8b59779a471f_0

bdb4d8365391 nithinsubbaraj/servicenow_pack “sh -ec '/bin/cp -aR…” 2 minutes ago Exited (0) 2 minutes ago k8s_st2-custom-packs_poc-stackstorm-st2sensorcontainer-645fdccb65-hgw7x_stackstorm_7f9b8f70-12f5-4230-a4b4-8b59779a471f_0

987b97bd51d0 rancher/pause:3.1 “/pause” 7 minutes ago Up 7 minutes k8s_POD_poc-stackstorm-st2sensorcontainer-645fdccb65-hgw7x_stackstorm_7f9b8f70-12f5-4230-a4b4-8b59779a471f_0

Please post the logs from the st2sensorcontainer Pod. There should be an indicator in there why it’s stopped/failed.

Thanks Eugen for helping with this.

Please find below the logs for the st2sensorcontainer Pod [it is too long]:

kubectl logs poc-stackstorm-st2sensorcontainer-645fdccb65-qpbwz
2020-12-08 10:46:56,990 DEBUG [-] Using Python: 3.6.9 (/opt/stackstorm/st2/bin/python)
2020-12-08 10:46:56,990 DEBUG [-] Using config files: /etc/st2/st2.conf,/etc/st2/st2.docker.conf,/etc/st2/st2.user.conf
2020-12-08 10:46:56,990 DEBUG [-] Using logging config: /etc/st2/logging.sensorcontainer.conf
2020-12-08 10:46:57,006 INFO [-] Connecting to database "st2" @ "poc-stackstorm-mongodb-ha-0.poc-stackstorm-mongodb-ha:27017,poc-stackstorm-mongodb-ha-1.poc-stackstorm-mongodb-ha:27017,poc-stackstorm-mongodb-ha-2.poc-stackstorm-mongodb-ha:27017 (replica set)" as user "admin".
2020-12-08 10:46:57,072 INFO [-] Successfully connected to database "st2" @ "poc-stackstorm-mongodb-ha-0.poc-stackstorm-mongodb-ha:27017,poc-stackstorm-mongodb-ha-1.poc-stackstorm-mongodb-ha:27017,poc-stackstorm-mongodb-ha-2.poc-stackstorm-mongodb-ha:27017 (replica set)" as user "admin".
2020-12-08 10:46:57,496 INFO [-] Using partitioner default with sensornode sensornode1.
2020-12-08 10:46:57,510 INFO [-] Found 2 registered sensors in db scan.
2020-12-08 10:46:57,512 INFO [-] Setting up container to run 2 sensors.
2020-12-08 10:46:57,512 INFO [-] Sensors list - ['linux.FileWatchSensor', 'servicenow_pack.DirectorySensor'].
2020-12-08 10:46:57,512 INFO [-] (PID:1) SensorContainer started.
2020-12-08 10:46:57,513 INFO [-] Running sensor linux.FileWatchSensor
2020-12-08 10:46:57,525 INFO [-] Connected to amqp://admin:@poc-stackstorm-rabbitmq-ha-discovery:5672//
2020-12-08 10:46:57,572 INFO [-] Sensor linux.FileWatchSensor started
2020-12-08 10:46:57,573 INFO [-] Running sensor servicenow_pack.DirectorySensor
2020-12-08 10:46:57,587 INFO [-] Sensor servicenow_pack.DirectorySensor started
2020-12-08 10:46:59,400 INFO [-] No config found for sensor "FileWatchSensor"
2020-12-08 10:46:59,401 INFO [-] Watcher started
2020-12-08 10:46:59,402 INFO [-] Running sensor initialization code
2020-12-08 10:46:59,409 INFO [-] Running sensor in passive mode
2020-12-08 10:46:59,438 INFO [-] Connected to amqp://admin:**@poc-stackstorm-rabbitmq-ha-discovery:5672//
2020-12-08 10:46:59,504 ERROR [-] Traceback (most recent call last):
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/container/sensor_wrapper.py", line 299, in _get_sensor_instance
class_name=self._class_name)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2common/util/loader.py", line 148, in register_plugin_class
module = imp.load_source(module_name, file_path)
File "/opt/stackstorm/virtualenvs/servicenow_pack/lib/python3.6/imp.py", line 172, in load_source
module = _load(spec)
File "<frozen importlib._bootstrap>", line 684, in _load
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/opt/stackstorm/packs/servicenow_pack/sensors/directory_sensor.py", line 10, in <module>
class DirectorySensor(PollingSensor):
File "/opt/stackstorm/packs/servicenow_pack/sensors/directory_sensor.py", line 16, in DirectorySensor
handler = TimedRotatingFileHandler(logfile, when="midnight", interval=1)
File "/usr/lib/python3.6/logging/handlers.py", line 202, in __init__
BaseRotatingHandler.__init__(self, filename, 'a', encoding, delay)
File "/usr/lib/python3.6/logging/handlers.py", line 57, in __init__
logging.FileHandler.__init__(self, filename, mode, encoding, delay)
File "/usr/lib/python3.6/logging/__init__.py", line 1032, in __init__
StreamHandler.__init__(self, self._open())
File "/usr/lib/python3.6/logging/__init__.py", line 1061, in _open
return open(self.baseFilename, self.mode, encoding=self.encoding)
FileNotFoundError: [Errno 2] No such file or directory: '/opt/stackstorm/packs/service_now/st2_log/sensor_log/sn_sensor.log'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/container/sensor_wrapper.py", line 371, in <module>
parent_args=parent_args)
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/container/sensor_wrapper.py", line 220, in __init__
self._sensor_instance = self._get_sensor_instance()
File "/opt/stackstorm/st2/lib/python3.6/site-packages/st2reactor/container/sensor_wrapper.py", line 306, in _get_sensor_instance
raise exc_cls(msg)
FileNotFoundError: Failed to load sensor class from file "/opt/stackstorm/packs/servicenow_pack/sensors/directory_sensor.py" (sensor file most likely doesn't exist or contains invalid syntax): [Errno 2] No such file or directory: '/opt/stackstorm/packs/service_now/st2_log/sensor_log/sn_sensor.log'
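A side note on the traceback above: the handler is built in the sensor's class body at import time, and Python's logging module never creates missing parent directories, so the import dies when the directory tree is absent in the container image (also note the hard-coded path says service_now while the pack directory is servicenow_pack). A minimal stand-alone sketch of the failure and a defensive fix, using a temporary directory as a stand-in for the pack's hard-coded path:

```python
import os
import tempfile
from logging.handlers import TimedRotatingFileHandler

# Stand-in for /opt/stackstorm/packs/service_now/st2_log/sensor_log/sn_sensor.log
base = tempfile.mkdtemp()
logfile = os.path.join(base, "st2_log", "sensor_log", "sn_sensor.log")

# Without this, TimedRotatingFileHandler raises FileNotFoundError,
# because logging opens the file immediately but never mkdirs for it.
os.makedirs(os.path.dirname(logfile), exist_ok=True)

handler = TimedRotatingFileHandler(logfile, when="midnight", interval=1)
handler.close()
print(os.path.isfile(logfile))  # the handler created the (empty) log file
```

This only papers over the symptom inside a Pod, though; the directory and file still live in the container's ephemeral filesystem.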

2020-12-08 10:47:10,029 ERROR [-]

2020-12-08 10:47:10,032 INFO [-] Connected to amqp://admin:**@poc-stackstorm-rabbitmq-ha-discovery:5672//
2020-12-08 10:47:12,599 INFO [-] Process for sensor servicenow_pack.DirectorySensor has exited with code 1
[... the same FileNotFoundError traceback repeats when the SensorContainer restarts the sensor process ...]

2020-12-08 10:47:17,607 INFO [-] Process for sensor servicenow_pack.DirectorySensor has exited with code 1

FileNotFoundError: [Errno 2] No such file or directory: '/opt/stackstorm/packs/service_now/st2_log/sensor_log/sn_sensor.log'

OK, you can now see why the sensor is failing. What is /opt/stackstorm/packs/service_now/st2_log/sensor_log/sn_sensor.log: a log file or something else? It's not recommended to write into the st2sensorcontainer filesystem during execution. Instead, use an external/remote system for storing logs.
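One simple way to follow that advice in Kubernetes (a sketch, not the pack's actual code): drop the file handler entirely and log to stdout, which the container runtime captures and kubectl logs exposes. Sensor code can normally use the logger st2 hands it via self.sensor_service.get_logger(), but a plain stdlib equivalent looks like:

```python
import logging
import sys

# Sketch: instead of a hard-coded file under /opt/stackstorm/packs/...,
# write log records to stdout so Kubernetes collects them per-Pod.
logger = logging.getLogger("servicenow_pack.DirectorySensor")
logger.setLevel(logging.INFO)

handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(levelname)s [%(name)s] %(message)s")
)
logger.addHandler(handler)

logger.info("sensor poll tick")  # visible via `kubectl logs <st2sensorcontainer pod>`
```

This also removes the import-time file I/O, so the sensor class can be loaded even on a read-only or freshly-built image.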

Thanks, we are figuring out how to use an external/remote system for storing logs and will continue troubleshooting.
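For an "external/remote system" without extra dependencies, one option is the stdlib SysLogHandler, which ships records over UDP to a syslog collector instead of touching the Pod's filesystem. The endpoint below is a hypothetical placeholder; in a real cluster it would be the address of your log-aggregation service.

```python
import logging
import logging.handlers

# Sketch: send sensor log records to a remote syslog endpoint over UDP.
# ("127.0.0.1", 514) is a placeholder for a real collector address.
logger = logging.getLogger("servicenow_pack.DirectorySensor")
logger.setLevel(logging.INFO)

handler = logging.handlers.SysLogHandler(address=("127.0.0.1", 514))
handler.setFormatter(logging.Formatter("%(name)s: %(levelname)s %(message)s"))
logger.addHandler(handler)

# UDP is fire-and-forget: this does not fail even if no collector is listening.
logger.info("ServiceNow ticket creation attempt logged remotely")
```

In practice most Kubernetes setups instead just log to stdout and let a cluster-level agent (fluentd, etc.) do the shipping, so this is only needed when you want the pack itself to target a specific collector.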