Title: | DECmcc user notes file. Does not replace IPMT. |
Notice: | Use IPMT for problems. Newsletter location in note 6187 |
Moderator: | TAEC::BEROUD |
Created: | Mon Aug 21 1989 |
Last Modified: | Wed Jun 04 1997 |
Last Successful Update: | Fri Jun 06 1997 |
Number of topics: | 6497 |
Total number of notes: | 27359 |
Could someone explain to me why an exporting sometimes goes into the SUSPEND state? I have run into this in a few different ways: for example, out of a total of 50 entities I found 40 in ACTIVE state and 10 in SUSPEND. Furthermore, on my test VAX, after an MCC shutdown/reboot, after killing the background process, or after a VMS shutdown/reboot, all the exportings are in SUSPEND state and I have to RESUME all of them. On another MCC (the VAX used to control Easynet, same HW+SW configuration) everything works fine.
Thanks in advance. Ciao, Luciano Barilaro
T.R | Title | User | Personal Name | Date | Lines |
---|---|---|---|---|---|
2765.1 | | TOOK::SHMUYLOVICH | | Wed Apr 15 1992 10:17 | 9 |
In Exporter V1.1, after 10 consecutive unsuccessful polls an Exporting request goes into the suspended state. Using Show Exporting you can find the reason for the last failed polls. Exporter V1.2 does not do this.
Sam
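As a rough illustration only (the syntax is not verified; the wildcard form is modeled on the SUSPEND/RESUME command quoted in .4, and "my_target" is a placeholder for your export target, so check HELP on your own kit for the exact argument names), the check Sam describes would look something like:

    ! hedged sketch -- adapt entity, wildcard and target names to your configuration
    MCC> SHOW EXPORT NODE4 * EXPO TARGET my_target ALL STATUS
    ! then look at "Time of last failed poll" and the reason reported for the last failures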
2765.2 | So, why in SUSPEND state after shutdown/reboot | MLNCSC::BARILARO | | Wed Apr 15 1992 11:25 | 13 |
Yes, I checked on my test MCC and the exportings in SUSPEND state have 10 (or more) failed polls. BUT my real and big problem is why all the exportings go into SUSPEND state after the death of the background process, or after a shutdown/reboot, so that I need to RESUME them all. As I wrote, on another MCC with the same configuration and the same SW (the only difference is the number of exportings), everything works fine after a shutdown/reboot.
Thanks, Luciano
2765.3 | | TOOK::SHMUYLOVICH | | Wed Apr 15 1992 14:28 | 41 |
The background restart procedure does not change the state of the exporting requests, so there are only two possible situations:

1. The exporting requests are already in the suspended state before the background process dies;
2. The restart procedure starts the exportings in the active state, but after a while they are suspended.

The Show Export command has several arguments which allow you to work out which situation you have. These arguments are:

- "State since" - depending on the value of this argument you can tell whether the exporting was suspended before or after the background process restart;
- "Time of last successful poll" and "Time of last failed poll" - these tell whether the exporting was active after the background process restart and whether there were any successful polls.

Or you can do the following experiment (sketched below):

1. Show Exporting to be sure that it is active;
2. Kill the background;
3. Start the background;
4. Show Exporting (if the state is active, repeat this command until it becomes suspended or until there is at least one successful poll).

In Exporter V1.1 there is a potential restart problem when the number of active exportings is big and the exporting period is small. The restart procedure forces all active exportings to start simultaneously, which means they compete for some of the resources. If the "waiting" time is bigger than 10 exporting periods, the corresponding request will be suspended. Exporter V1.2 is free from this problem.

BTW, what entity classes are you exporting and what is the export period?
Sam
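A rough, annotated transcript of that experiment might look like the following. This is only a sketch: the Show syntax is guessed from the command quoted in .4, the PID is a placeholder, and step 3 stands in for whatever procedure your site normally uses to start the Exporter background process.

    ! 1. confirm the exporting request is Active (syntax modeled on .4 -- check HELP on your kit)
    MCC> SHOW EXPORT NODE4 * EXPO TARGET my_target ALL STATUS
    ! 2. kill the background process (substitute its real PID)
    $ STOP/IDENTIFICATION=xxxxxxxx
    ! 3. restart the background -- hypothetical placeholder for your site's normal startup procedure
    $ @SYS$STARTUP:<your_exporter_startup>.COM
    ! 4. repeat until the request suspends or at least one successful poll is recorded
    MCC> SHOW EXPORT NODE4 * EXPO TARGET my_target ALL STATUS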
2765.4 | Process in HIB state | MLNCSC::BARILARO | | Tue Apr 21 1992 13:51 | 16 |
Sorry for the delay in my answer. I tried to make the checks you suggested, BUT now a side problem has appeared: when I tried to RESUME the exportings that I had previously SUSPENDed, the processes went into HIBernate state (in one case for more than 3 days, over this past weekend) and I had to kill them.

Just to explain my configuration: I have registered all the nodes in area 46 (my area), something like 1000 nodes, but exporting is enabled for only 20/30 nodes. To SUSPEND/RESUME I use the command:

MCC> SUSPEND/RESUME export node4 * expo target XXXXXXXXX

Let me know if you can think of some other checks; otherwise I plan to deregister all the nodes and register them again.
Thanks, Ciao Luciano.
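For what it's worth, before killing a process stuck like that, a couple of standard DCL checks can show whether it is really idle (the PID below is a placeholder; this is a generic VMS sketch, not something specific to the Exporter):

    $ SHOW SYSTEM                           ! find the background process and its scheduling state (HIB, LEF, ...)
    $ SHOW PROCESS/CONTINUOUS/ID=xxxxxxxx   ! watch CPU time and I/O counts; if they never move, it is truly hung
    $ STOP/IDENTIFICATION=xxxxxxxx          ! last resort: kill it, then restart the background process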