[myexperiment-hackers] [2031] branches/event_logging: News generation.


From: noreply
Subject: [myexperiment-hackers] [2031] branches/event_logging: News generation.
Date: Thu, 4 Dec 2008 11:28:29 -0500 (EST)

Revision
2031
Author
alekses6
Date
2008-12-04 11:28:27 -0500 (Thu, 04 Dec 2008)

Log Message

News generation. Work on news entry grouping.

1) Improved sorting of events: now sorted by date (descending) AND by id of the record within the activity_logs table (a more accurate order in the feeds).

2) Bug fix: a profile update event in the "public news" on the home page was causing crashes.

3) Initial grouping of taggings (a single news entry now shows multiple tags attached to an item, rather than many separate entries as before) and of profile updates (only the latest one is shown).

NB! The code still contains a lot of debug output; this will be removed in a later commit.
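The two-key ordering in point 1 can be sketched in plain Ruby. This is a minimal stand-in (the `Event` struct below is hypothetical, not the real ActivityLog model): newest `created_at` first, with ties broken by `id`, also descending.

```ruby
# Stand-in for the ActivityLog record; only the two sort keys matter here.
Event = Struct.new(:id, :created_at)

def sort_events(events)
  # Array comparison is element-wise, so created_at decides first and
  # id breaks ties; b-vs-a yields descending order.
  events.uniq.sort { |a, b| [b.created_at, b.id] <=> [a.created_at, a.id] }
end

t = Time.utc(2008, 12, 4, 11, 0, 0)
sorted = sort_events([Event.new(1, t), Event.new(3, t), Event.new(2, t + 60)])
sorted.map(&:id) # => [2, 3, 1]
```

The id tiebreak is what makes the feed order stable when several events share the same timestamp.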

Modified Paths

Diff

Modified: branches/event_logging/app/helpers/application_helper.rb (2030 => 2031)


--- branches/event_logging/app/helpers/application_helper.rb	2008-12-03 14:55:06 UTC (rev 2030)
+++ branches/event_logging/app/helpers/application_helper.rb	2008-12-04 16:28:27 UTC (rev 2031)
@@ -1736,10 +1736,130 @@
     # remove any duplicates (which may arise when getting same event log entry from friends' related events),  
     # then sort by date descending; then delete a single (only possible after "uniq") empty element
     events_unique = events.uniq.sort { |a, b|
-      b.created_at <=> a.created_at
+      [b.created_at, b.id] <=> [a.created_at, a.id]
     }
     events_unique.delete([])
     
+    ################ GROUPING ###############
+    # going backwards; events with smaller indices in the array are later in time
+    
+    # DEBUG
+    puts "=============== GROUPING ================="
+    # END DEBUG
+    
+    related_activity_loggable_ids = []
+    array_ids_to_delete = []
+    grouping_happened = false
+    
+    seq_timestamp = nil
+    seq_start_idx = events_unique.length - 1 # the last element
+    base_event = events_unique[seq_start_idx]
+    
+    i = events_unique.length - 2 # one before the last one
+    
+    while (i >= 0) do
+      # DEBUG
+      puts "\ni = #{i}"
+      # END DEBUG
+      
+      
+      # initialization for new loop iteration
+      new_event = events_unique[i]
+      reset_required = true
+      
+      # check if grouping will happen
+      if (new_event.created_at - base_event.created_at <= EVENT_GROUPING_TIMEFRAME)
+        # DEBUG
+        puts "timeframe match: seq_start(#{seq_start_idx}) -> current(#{i})"
+        puts "new: " + new_event.action + " base: " + base_event.action + " #{ new_event.action == base_event.action}"
+        puts "new: " + new_event.activity_loggable_type + " base: " + base_event.activity_loggable_type + " #{ new_event.activity_loggable_type == base_event.activity_loggable_type}"
+        puts "new: " + new_event.culprit_type + " base: " + base_event.culprit_type  + " #{ new_event.culprit_type == base_event.culprit_type}" unless new_event.culprit_type.nil? || base_event.culprit_type.nil?
+        puts "new: #{new_event.culprit_id} base: #{base_event.culprit_id} #{new_event.culprit_id == base_event.culprit_id}" unless new_event.culprit_id.nil? || base_event.culprit_id.nil?
+        puts "new: " + new_event.referenced_type + " base: " + base_event.referenced_type  + " #{ new_event.referenced_type == base_event.referenced_type}" unless new_event.referenced_type.nil? || base_event.referenced_type.nil?
+        puts "new: #{new_event.referenced_id} base: #{base_event.referenced_id} #{new_event.referenced_id == base_event.referenced_id}" unless new_event.referenced_id.nil? || base_event.referenced_id.nil?
+        
+        puts "-- first condition of further check: #{["Tagging"].include? new_event.activity_loggable_type}"
+        puts "-- second condition of further check: #{new_event.action == "create"}"
+        puts "\n"
+        
+        # END DEBUG
+        
+        # events are within the correct timeframe and can potentially be grouped
+        if ((["Tagging"].include? new_event.activity_loggable_type) && new_event.action == "create" &&
+            new_event.action == base_event.action && new_event.activity_loggable_type == base_event.activity_loggable_type &&
+            new_event.culprit_type == base_event.culprit_type && new_event.culprit_id == base_event.culprit_id &&
+            new_event.referenced_type == base_event.referenced_type && new_event.referenced_id == base_event.referenced_id)
+          
+           # DEBUG
+           puts "grouping should happen"
+           # END DEBUG
+           
+           seq_timestamp = [new_event.created_at, new_event.updated_at]
+           related_activity_loggable_ids << new_event.activity_loggable_id
+           array_ids_to_delete << i
+           
+           grouping_happened = true
+           reset_required = false
+        elsif ((["Profile"].include? new_event.activity_loggable_type) && new_event.action == "update" &&
+               new_event.action == base_event.action && new_event.activity_loggable_type == base_event.activity_loggable_type &&
+               new_event.culprit_type == base_event.culprit_type && new_event.culprit_id == base_event.culprit_id &&
+               new_event.referenced_type == base_event.referenced_type && new_event.referenced_id == base_event.referenced_id)
+           
+           seq_timestamp = [new_event.created_at, new_event.updated_at]
+           array_ids_to_delete << i
+           
+           grouping_happened = true
+           reset_required = false
+        end
+      end
+      
+      if reset_required
+        # current sequence has ended (or never started) - the current element is "different" enough, so it starts a new sequence
+        # (no processing of the current element is required at this iteration - only finalizing actions on the old sequence)
+        if grouping_happened
+          # DEBUG
+          puts "grouping happened, seq finished"
+          # END DEBUG
+          
+          events_unique[seq_start_idx].created_at = seq_timestamp[0]
+          events_unique[seq_start_idx].updated_at = seq_timestamp[1]
+          events_unique[seq_start_idx] = [events_unique[seq_start_idx], related_activity_loggable_ids]
+          # delete all (now) redundant elements
+          array_ids_to_delete.each do |del_idx|
+            events_unique.delete_at(del_idx)
+          end
+        end
+        
+        # reset status variables
+        seq_start_idx = i # current element becomes the start of the sequence
+        base_event = events_unique[seq_start_idx]
+        seq_timestamp = nil
+        related_activity_loggable_ids = []
+        array_ids_to_delete = []
+        grouping_happened = false
+      end
+      
+      i -= 1
+    end
+    
+    if grouping_happened
+      # the zeroth element in array (last one in the loop) got grouped
+      
+      # DEBUG
+      puts "grouping happened at last element, seq finished"
+      # END DEBUG
+      
+      events_unique[seq_start_idx].created_at = seq_timestamp[0]
+      events_unique[seq_start_idx].updated_at = seq_timestamp[1]
+      events_unique[seq_start_idx] = [events_unique[seq_start_idx], related_activity_loggable_ids]
+      # delete all (now) redundant elements
+      array_ids_to_delete.each do |del_idx|
+        events_unique.delete_at(del_idx)
+      end
+    end
+    
+    ################
+    
     # produce news from event list;
     # if enough events are available in the "events" array, this method will
     # always produce "limit" number of news entries - considering authorization and possible
@@ -1785,7 +1905,17 @@
   # NIL is returned when the user viewing the news is not authorized to see
   # current news entry OR entry in the event log is no longer valid because of
   # some of the referenced objects missing (perhaps, they were deleted over time)  
-  def news_entry_from_log_entry!(log_entry, current_viewer, contributor, contributor_news_only)
+  def news_entry_from_log_entry!(log_entry_container, current_viewer, contributor, contributor_news_only)
+    if log_entry_container.class.name == "Array"
+      # if log_entry_container holds an array, the zeroth element is the actual log_entry object!
+      log_entry = log_entry_container[0]
+      extra_ids = log_entry_container[1]
+    else
+      # the log entry container should be the log_entry itself
+      log_entry = log_entry_container
+      extra_ids = []
+    end
+    
     rtn = [] # despite this, NIL will be returned on errors / when news entry not to be shown for current user
     action = ""
     loggable_type = log_entry.activity_loggable_type
@@ -2278,8 +2408,28 @@
       when "Tagging"
         if action == "create"
           begin
-            tagging = Tagging.find(log_entry.activity_loggable_id)
-            tag = Tag.find(tagging.tag_id)
+            # process potentially multiple tags
+            all_tagging_ids = [log_entry.activity_loggable_id]
+            all_tagging_ids.concat(extra_ids)
+            not_found_tag_count = 0
+            tag_strings = []
+            
+            all_tagging_ids.each do |tagging_id|
+              # protected block required so that if some (but not all) tags are missing, the remaining ones still get displayed
+              begin
+                tagging = Tagging.find(tagging_id)
+                tag = Tag.find(tagging.tag_id)
+                tag_strings << ["\"" + link_to(tag.name, tag_path(tag.id)) + "\""]
+              rescue ActiveRecord::RecordNotFound
+                not_found_tag_count += 1
+              end
+            end
+            
+            if not_found_tag_count == all_tagging_ids.length
+              raise ActiveRecord::RecordNotFound, "None of the tags found"
+            end
+            
+            
             object, object_path = evaluate_object_instance_and_path(log_entry.referenced_type, log_entry.referenced_id)
             object_visible_name = contributable_name_from_instance(object)
             
@@ -2287,7 +2437,7 @@
             authorized = ( my_event || object.authorized?("view", current_viewer) )
             
             if authorized
-              rtn << [timestamp, "#{culprit_link} <span class='news_feed_action'>tagged</span> #{link_to object_visible_name, object_path} #{model_visible_name(log_entry.referenced_type.to_s, true, object)} with \"#{link_to tag.name, tag_path(tag.id)}\".", "Tags"]
+              rtn << [timestamp, "#{culprit_link} <span class='news_feed_action'>tagged</span> #{link_to object_visible_name, object_path} #{model_visible_name(log_entry.referenced_type.to_s, true, object)} with #{tag_strings.join(", ")}.", "Tags"]
             end
           rescue ActiveRecord::RecordNotFound
             # do nothing, but don't display the news entry for missing tagging / tag / object
@@ -2327,7 +2477,7 @@
           # as these are created / deleted along with the user account
           when "update"
             # only friends of the user will see this event
-            if my_event || (logged_in? && current_viewer.friend?(log_entry.culprit_id))
+            if my_event || (logged_in? && !current_viewer.nil? && (current_viewer != 0) && current_viewer.friend?(log_entry.culprit_id))
               rtn << [timestamp, "#{culprit_link} has <span class='news_feed_action'>updated</span> their #{link_to "profile", user_path(log_entry.culprit_id)}.", "User Profile Updates"]
             end
         end
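The grouping pass in the diff above can be condensed into a small sketch. Everything below is a hedged illustration under stated assumptions: `Ev` and `group_taggings` are hypothetical stand-ins for the ActivityLog model and the in-place loop, and only the "Tagging"/"create" branch is modeled. It collapses consecutive same-culprit, same-object tagging events within the timeframe into a `[base_event, extra_activity_loggable_ids]` pair, mirroring how `news_entry_from_log_entry!` unwraps the container.

```ruby
GROUPING_TIMEFRAME = 5 * 60 # seconds; stands in for EVENT_GROUPING_TIMEFRAME

# Stand-in struct with just the fields the grouping conditions compare.
Ev = Struct.new(:id, :created_at, :action, :activity_loggable_type,
                :activity_loggable_id, :culprit_id, :referenced_id)

def group_taggings(events)
  grouped = []
  events.reverse_each do |ev| # input is newest-first, so walk oldest-first
    base = grouped.last.is_a?(Array) ? grouped.last[0] : grouped.last
    if base && ev.action == "create" && ev.activity_loggable_type == "Tagging" &&
       ev.action == base.action &&
       ev.activity_loggable_type == base.activity_loggable_type &&
       ev.culprit_id == base.culprit_id &&
       ev.referenced_id == base.referenced_id &&
       (ev.created_at - base.created_at) <= GROUPING_TIMEFRAME
      # extend the current sequence: wrap the base once, then collect ids
      grouped[-1] = [base, []] unless grouped.last.is_a?(Array)
      grouped.last[1] << ev.activity_loggable_id
    else
      grouped << ev # sequence broken (or never started): new base
    end
  end
  grouped.reverse # restore newest-first order
end

t  = Time.utc(2008, 12, 4, 11, 0, 0)
e1 = Ev.new(1, t,       "create", "Tagging", 101, 7, 50)
e2 = Ev.new(2, t + 60,  "create", "Tagging", 102, 7, 50)
e3 = Ev.new(3, t + 120, "create", "Tagging", 103, 7, 50)
grouped = group_taggings([e3, e2, e1])
# three tagging events collapse into one [base, extra_ids] entry
```

Note the committed code additionally carries the newest timestamps over onto the base event and handles the "Profile"/"update" branch; this sketch omits both for brevity.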

Modified: branches/event_logging/config/environment_private.rb.pre (2030 => 2031)


--- branches/event_logging/config/environment_private.rb.pre	2008-12-03 14:55:06 UTC (rev 2030)
+++ branches/event_logging/config/environment_private.rb.pre	2008-12-04 16:28:27 UTC (rev 2031)
@@ -90,6 +90,9 @@
 # of news feeds (in SECONDS)
 NEWS_CACHE_TIMEOUT = 300
 
+# similar events within this timeframe will be grouped together, where possible
+EVENT_GROUPING_TIMEFRAME = 5.minutes
+
 # Default timeframes for various types of news
 # [this means that only events after (Time.now - DEFAULT_<>_TIMEFRAME) will be fetched from event log]
 DEFAULT_NEWS_TIMEFRAME = 1.week
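As a plain-Ruby illustration of the new constant's semantics (assuming only that `5.minutes` is the usual ActiveSupport shorthand for 300 seconds): the grouping code compares a `Time` difference, which is expressed in seconds, against it.

```ruby
EVENT_GROUPING_TIMEFRAME = 5 * 60 # same value as ActiveSupport's 5.minutes

older = Time.utc(2008, 12, 4, 11, 0, 0)
newer = Time.utc(2008, 12, 4, 11, 4, 59)
# Time - Time yields seconds, so this is a direct comparison.
groupable = (newer - older) <= EVENT_GROUPING_TIMEFRAME # => true
```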
