Human Generated Data

Title

Untitled (Old Home Day)

Date

1956, printed later

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.393

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.3
Human 99.3
Person 99
Person 98.5
Person 94.7
Wheel 81
Machine 81
Person 77
Wheel 75
Person 71.3
Person 62
Clothing 61.4
Apparel 61.4
Stroller 59.7
Wheel 53.8
Person 41.6
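
Scores in this style match the output of Amazon Rekognition's DetectLabels API. A minimal sketch of such a call, assuming configured boto3 credentials and a hypothetical local copy of the print named scan.jpg (neither is part of this record):

import boto3

# "scan.jpg" is a hypothetical local file; AWS credentials are assumed
# to be configured in the environment.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=40,  # the lowest score listed above is Person 41.6
    )

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))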

Clarifai
created on 2023-10-25

people 99.9
child 98.3
vehicle 98.2
monochrome 98.2
group together 98.1
transportation system 97.6
adult 96.8
cart 94.9
woman 94
two 93.9
man 93.8
group 93.7
several 93.3
street 91.6
wear 91.1
many 90.5
boy 90
recreation 89.5
three 89.3
four 89.2
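
Concept lists like this one resemble the output of Clarifai's general image-recognition model. A hedged sketch over Clarifai's v2 REST API; the model ID, the authorization scheme, and scan.jpg are assumptions, so check Clarifai's current documentation before relying on them:

import base64
import requests

# Placeholder credentials and model ID. Clarifai returns concept
# scores in [0, 1]; they are scaled to match the percentages above.
API_KEY = "YOUR_CLARIFAI_KEY"
MODEL_ID = "general-image-recognition"

with open("scan.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))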

Imagga
created on 2022-01-08

pedestrian 72.7
barrow 48.6
wheeled vehicle 41.1
handcart 40.3
vehicle 33.1
man 20.2
old 17.4
conveyance 16.9
people 15.6
outdoors 14.9
tricycle 13.8
male 12.8
outdoor 12.2
work 11
sky 10.8
grass 10.3
wheelchair 10.2
road 9.9
care 9.9
transportation 9.9
park 9.1
adult 9.1
chair 8.9
person 8.8
men 8.6
wheel 8.5
field 8.4
machine 8.1
sunset 8.1
tool 8.1
mountain 8
cart 8
travel 7.7
horse 7.6
beach 7.6
dark 7.5
city 7.5
landscape 7.4
back 7.3
active 7.2
summer 7.1
day 7.1
architecture 7
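
Imagga exposes auto-tagging through its /v2/tags endpoint. A minimal sketch, assuming an Imagga key/secret pair and a publicly reachable image URL (both placeholders):

import requests

# Imagga v2 uses HTTP basic auth with an API key/secret pair; the
# credentials and image URL below are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/scan.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),
)
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))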

Google
created on 2022-01-08

(no tags recorded)

Microsoft
created on 2022-01-08

outdoor 99.8
tree 99.3
sky 99.2
text 96.4
person 86.1
clothing 81.7
man 75.4
vehicle 71.5
old 70.5
land vehicle 64.3
vintage 26.9
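
Tags in this style come from Azure's Computer Vision image-analysis service. A hedged sketch against the v3.2 analyze endpoint; the resource endpoint, key, and scan.jpg are placeholders:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder

with open("scan.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

# Azure reports confidence in [0, 1]; scaled to match the list above.
for tag in resp.json()["tags"]:
    print(tag["name"], round(tag["confidence"] * 100, 1))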

Color Analysis

(no data recorded)

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 96.8%
Sad 77.3%
Confused 14%
Disgusted 2.7%
Happy 2.5%
Calm 1.2%
Angry 1.2%
Fear 0.7%
Surprised 0.4%

AWS Rekognition

Age 23-33
Gender Female, 99.7%
Happy 98.5%
Calm 0.4%
Angry 0.3%
Sad 0.2%
Surprised 0.2%
Fear 0.2%
Disgusted 0.1%
Confused 0.1%

AWS Rekognition

Age 6-12
Gender Female, 51.6%
Angry 62.2%
Disgusted 28.5%
Sad 4.7%
Confused 2.6%
Calm 0.8%
Surprised 0.4%
Happy 0.4%
Fear 0.3%

AWS Rekognition

Age 6-14
Gender Female, 67.1%
Sad 100%
Fear 0%
Angry 0%
Confused 0%
Calm 0%
Disgusted 0%
Happy 0%
Surprised 0%
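
Per-face blocks like the four above match Rekognition's DetectFaces output with all facial attributes requested. A minimal sketch, under the same boto3 and scan.jpg assumptions as the earlier Rekognition example:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # age range, gender, and emotion scores
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions arrive unsorted; sort to mirror the listings above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")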

Microsoft Cognitive Services

Age 21
Gender Male
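
Age and gender estimates like this came from Microsoft's legacy Face API, whose facial-attribute features have since been retired for most customers. A historical sketch only; the endpoint, key, and scan.jpg are placeholders:

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"  # placeholder

with open("scan.jpg", "rb") as f:
    resp = requests.post(
        f"{ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "age,gender"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}, Gender {attrs['gender'].title()}")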

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely
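
Google Cloud Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the four blocks above carry no scores. A minimal sketch, assuming application default credentials and the hypothetical scan.jpg:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("scan.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
for face in client.face_detection(image=image).face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)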

Feature analysis

Amazon

Person 99.3%
Wheel 81%
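
Feature entries like these look like the subset of DetectLabels results that carry localized Instances (bounding boxes). A sketch under the same boto3 and scan.jpg assumptions as the earlier Rekognition calls:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")
with open("scan.jpg", "rb") as f:
    labels = rekognition.detect_labels(Image={"Bytes": f.read()})["Labels"]

# Only some labels (e.g. Person, Wheel) include per-instance boxes.
for label in labels:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"at left={box['Left']:.2f}, top={box['Top']:.2f}")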

Categories

Imagga

paintings art 49.9%
people portraits 47.4%
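
Category pairs like these resemble the output of Imagga's categorizer endpoint; the personal_photos categorizer is a plausible source, though that choice and the URL below are assumptions:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/scan.jpg"},
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),  # placeholders
)
for category in resp.json()["result"]["categories"]:
    print(category["name"]["en"], round(category["confidence"], 1))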

Text analysis

Amazon

1756
OR
BUST
Stager
SYNDLANK

Google

OR AUST 1756 Stagar
OR
AUST
1756
Stagar
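
OCR output like Amazon's comes from Rekognition's DetectText, which returns whole LINE detections alongside individual WORD detections; Google Vision's text_detection behaves similarly, returning a full-text string plus per-word entries, which explains the two Google listings above. A minimal sketch of the Rekognition call, same assumptions as the earlier examples:

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("scan.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# WORD entries correspond to the short fragments listed above; LINE
# entries group them into reading-order lines.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])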