Hi, here's your problem today. This problem was recently asked by Apple:

You are given two singly linked lists. The lists intersect at some node. Find and return that node. Note: the lists are non-cyclical.

Example:


A = 1 -> 2 -> 3 -> 4
B = 6 -> 3 -> 4


This should return the node with value 3 (you may assume that any nodes with the same value are the same node).

Here is a starting point:


def intersection(a, b):
    # Fill this in.
    pass


class Node(object):
    def __init__(self, val):
        self.val = val
        self.next = None

    def prettyPrint(self):
        # Print the values from this node to the end of the list.
        c = self
        while c:
            print(c.val, end=' ')
            c = c.next


a = Node(1)
a.next = Node(2)
a.next.next = Node(3)
a.next.next.next = Node(4)

# B shares the nodes 3 -> 4 with A.
b = Node(6)
b.next = a.next.next

c = intersection(a, b)
c.prettyPrint()
# 3 4
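
For reference, here is a minimal sketch of one way to fill in intersection, using the length-difference technique. This is an illustration rather than the post's official solution, and the length helper below is an assumed name, not part of the original starting point: advance the head of the longer list until both lists have the same number of remaining nodes, then walk the two lists in lockstep until the pointers meet.

def length(head):
    # Count the nodes in a list (assumed helper, not in the original post).
    n = 0
    while head:
        n += 1
        head = head.next
    return n

def intersection(a, b):
    # Length-difference approach: skip ahead in the longer list so both
    # lists have the same number of nodes left, then advance in lockstep.
    len_a, len_b = length(a), length(b)
    while len_a > len_b:
        a, len_a = a.next, len_a - 1
    while len_b > len_a:
        b, len_b = b.next, len_b - 1
    # The first node where the two pointers coincide is the intersection.
    while a is not b:
        a = a.next
        b = b.next
    return a  # None if the lists never intersect

With the lists built above, this returns the node holding 3, so c.prettyPrint() prints 3 4. It runs in O(n + m) time with O(1) extra space.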