Human Generated Data

Title

Untitled (two car accident, with people gathered around)

Date

c. 1970, printed from 1954 negative

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.18478

Machine Generated Data

Tags (confidence scores)

Amazon
created on 2022-02-25

Human 99.6
Person 99.6
Person 98.7
Transportation 98.6
Car 98.6
Automobile 98.6
Vehicle 98.6
Van 96
Machine 95
Wheel 95
Car 94.9
Person 94
Person 92.9
Person 89.6
Pedestrian 88.4
Apparel 86.9
Clothing 86.9
Person 85.8
Person 82.9
Shorts 82.3
Person 81.5
Person 80.7
Person 80
Person 78.9
Car 77.6
Spoke 74.2
Tire 68.9
Tarmac 67.8
Asphalt 67.8
Person 67.3
Alloy Wheel 65.8
Caravan 62.8
Road 62.4
People 61.8
Watercraft 58.9
Vessel 58.9
Bus 57.1

Imagga
created on 2022-02-25

garage 60
car 45
sidewalk 28.6
road 27.1
transportation 21.5
vehicle 21.3
automobile 18.2
travel 17.6
street 15.6
auto 15.3
drive 15.1
sea 14.1
motor vehicle 14.1
sky 14
water 12.7
city 12.5
boat 12.1
transport 11.9
sand 11.7
driving 11.6
vacation 11.5
swing 11.3
beach 11.1
cars 10.8
wheel 10.5
landscape 10.4
danger 10
trees 9.8
old 9.8
urban 9.6
day 9.4
ocean 9.1
machine 9
truck 8.9
ship 8.8
sport 8.7
scene 8.7
track 8.6
mechanical device 8.6
snow 8.4
wheeled vehicle 8.3
river 8
rural 7.9
outdoors 7.9
driver 7.8
tree 7.8
broken 7.7
engine 7.7
traffic 7.6
plaything 7.5
weather 7.5
bus 7.5
vintage 7.4
minivan 7.4
building 7.4
speed 7.3
island 7.3
people 7.3
sun 7.2
grass 7.1
work 7.1

Microsoft
created on 2022-02-25

outdoor 99
sky 98.2
vehicle 98.1
land vehicle 97.8
car 94.4
text 87.9
wheel 85.3

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 95.7%
Calm 53.6%
Disgusted 25%
Sad 7.4%
Happy 6.7%
Fear 2.7%
Surprised 2%
Angry 1.6%
Confused 1%

AWS Rekognition

Age 13-21
Gender Male, 93.4%
Calm 93.9%
Angry 2.2%
Sad 1.1%
Surprised 1%
Confused 0.6%
Disgusted 0.5%
Happy 0.3%
Fear 0.3%

AWS Rekognition

Age 20-28
Gender Male, 99.4%
Surprised 39.3%
Happy 21.7%
Confused 13.2%
Disgusted 13.2%
Angry 5.6%
Calm 3.5%
Sad 2.5%
Fear 1%

AWS Rekognition

Age 47-53
Gender Male, 99.6%
Calm 40.9%
Confused 18.1%
Disgusted 14.3%
Angry 12.6%
Sad 6.6%
Surprised 4.9%
Fear 1.4%
Happy 1.2%

AWS Rekognition

Age 21-29
Gender Male, 99%
Calm 92.3%
Surprised 3.2%
Sad 2.2%
Confused 0.5%
Fear 0.5%
Angry 0.5%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 27-37
Gender Female, 88.3%
Sad 37.9%
Disgusted 29.2%
Calm 9.7%
Surprised 8.8%
Angry 5.6%
Fear 4.7%
Confused 2.7%
Happy 1.6%

AWS Rekognition

Age 22-30
Gender Male, 99.6%
Calm 48.2%
Disgusted 35.4%
Happy 5.9%
Angry 4.4%
Sad 3.7%
Confused 1.1%
Surprised 1%
Fear 0.4%

AWS Rekognition

Age 29-39
Gender Male, 98.2%
Calm 89.7%
Sad 3%
Disgusted 1.8%
Confused 1.4%
Surprised 1.4%
Angry 1.2%
Fear 0.8%
Happy 0.7%

AWS Rekognition

Age 29-39
Gender Male, 99.8%
Calm 45.2%
Happy 17.9%
Angry 12%
Confused 9.7%
Sad 6.3%
Surprised 3.8%
Disgusted 3.7%
Fear 1.3%

AWS Rekognition

Age 24-34
Gender Male, 93.2%
Calm 43.1%
Sad 37.8%
Happy 16.3%
Fear 0.8%
Disgusted 0.5%
Surprised 0.5%
Confused 0.5%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Possible

Feature analysis

Amazon

Person 99.6%
Car 98.6%
Wheel 95%

Captions

Microsoft

a group of people standing outside of a building 88.5%
a group of people standing in front of a building 87.4%
a group of people standing next to a building 87.2%

Text analysis

Amazon

41
KOREA