Commit efce8eff authored by Per Cederqvist

LYSrdiff now randomizes the backup order a bit less.

* distribute-tasks (ordered_tasks): New global variable.
  (read_tasks): Store the job in it as well.
  (read_new_tasks): Ditto.
  (write_task_lists): Retain roughly the old order, but move a few
    random jobs to the front of the job queue.
parent 39881334
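The reordering described in the commit message can be read as a small standalone function. The sketch below is only an illustration of the diff that follows: the name reorder_jobs and the empty-list guard are mine, not part of the patch, but the loop body mirrors the code added to write_task_lists.

import random

def reorder_jobs(ordered_tasks):
    """Return a copy of ordered_tasks with a few randomly chosen jobs
    moved to the front, leaving the relative order of the rest intact."""
    jobs = ordered_tasks[:]            # work on a copy; keep the old order
    if not jobs:                       # guard added here; the patch assumes a non-empty queue
        return jobs
    # Roughly 0.5% of the jobs, and always at least one, are promoted.
    for _ in range(1 + int(0.005 * len(jobs))):
        lucky_ix = random.randrange(0, len(jobs))
        # Move the lucky job to the front; all other jobs keep their
        # relative order, so the queue stays close to last time's order.
        jobs = [jobs[lucky_ix]] + jobs[:lucky_ix] + jobs[lucky_ix + 1:]
    return jobs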
+2006-10-24  Per Cederqvist  <ceder@sedesopim.lysator.liu.se>
+
+        LYSrdiff now randomizes the backup order a bit less.
+
+        * distribute-tasks (ordered_tasks): New global variable.
+        (read_tasks): Store the job in it as well.
+        (read_new_tasks): Ditto.
+        (write_task_lists): Retain roughly the old order, but move a few
+        random jobs to the front of the job queue.
+
 2006-10-23  Per Cederqvist  <ceder@lysator.liu.se>
 
         LYSrdiff can now backup up to separate partitions, creates more
...
@@ -52,6 +52,9 @@ def newtasks():
 # value: JobInfo
 tasks_per_source = {}
 
+# value: JobInfo
+ordered_tasks = []
+
 fatal = False
 
 def tasklist_file(lysrdiffpart):
...
@@ -75,6 +78,7 @@ def read_tasks(lysrdiffpart):
             fatal = True
         tasks_per_source[info.source()] = info
+        ordered_tasks.append(info)
 
 def read_new_tasks():
     new_found = False
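The two hunks above keep the new ordered_tasks list in step with the existing tasks_per_source dict: the dict still answers whether a source is already known, while the list remembers the order in which jobs were added, which a plain dict of this Python vintage does not. A minimal sketch of the pattern, using a hypothetical register helper that is not part of the patch:

# Hypothetical illustration of the dict-plus-ordered-list pattern used above.
tasks_per_source = {}     # keyed lookup: source -> job info
ordered_tasks = []        # the same job infos, in the order they were first seen

def register(source, info):
    if source not in tasks_per_source:
        tasks_per_source[source] = info
        ordered_tasks.append(info)     # preserves first-seen order for later scheduling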
...
@@ -83,13 +87,23 @@ def read_new_tasks():
         if (info.host(), info.directory()) not in tasks_per_source:
             info.set_lysrdiffpart(newtasks())
             tasks_per_source[(info.host(), info.directory())] = info
+            ordered_tasks.append(info)
             new_found = True
     return new_found
 
 def write_task_lists():
-    jobs = tasks_per_source.values()
-    random.shuffle(jobs)
+    jobs = ordered_tasks[:]
+    # Pick a few lucky jobs and move them to the front of the queue.
+    # This way, we get roughly the same order as on the previous
+    # backup (which is good because each job will then be backuped up
+    # with approximately the same interval) but no job is (on average)
+    # favoured over any other job.
+    for x in range(1 + int(0.005 * len(jobs))):
+        lucky_ix = random.randrange(0, len(jobs))
+        jobs = [jobs[lucky_ix]] + jobs[:lucky_ix] + jobs[lucky_ix+1:]
     files = {}
     for job in jobs:
         if job.lysrdiffpart() not in files:
...
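For a rough feel of the effect, the hypothetical reorder_jobs sketch above can be run on a toy queue. With 200 jobs the loop runs 1 + int(0.005 * 200) = 2 times, so a couple of jobs are promoted on each run while every other job keeps its previous relative order:

random.seed(0)                      # only to make the demo repeatable
jobs = ["job%03d" % i for i in range(200)]
reordered = reorder_jobs(jobs)
print(reordered[:4])                # the promoted jobs, then the old head of the queue
promoted = set(reordered[:2])
# Everything that was not promoted is still in its original relative order.
print([j for j in reordered if j not in promoted] ==
      [j for j in jobs if j not in promoted])          # prints True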