Human Generated Data

Title

Don’s 1953 Ford

Date

1967

People

Artist: Danny Lyon, American, born 1942

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Susan and Neal Yanofsky, 2014.498

Copyright

© Danny Lyon/Magnum Photos

Machine Generated Data

Tags (each label is followed by its confidence score, in percent)

Amazon
created on 2019-04-08

Person 99.7
Human 99.7
Person 99.5
Person 99.5
Person 99
Automobile 96.5
Car 96.5
Transportation 96.5
Vehicle 96.5
Person 95.1
Convertible 87.7
Antique Car 87.3
Model T 87.3
Sitting 77.9
Amusement Park 62.3
Theme Park 62.3
Face 59.2
People 58
Hot Rod 55.5
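The Amazon list above comes from AWS Rekognition's label-detection service, which pairs each label with a confidence score. A minimal sketch of filtering such results at a confidence threshold, assuming the labels have been transcribed into Rekognition's DetectLabels response shape (`Name`/`Confidence` keys); the 95% cut-off is an illustrative choice, not part of the record:

```python
# A few labels from the record above, in the Name/Confidence shape
# that AWS Rekognition's DetectLabels API returns (values are percent).
labels = [
    {"Name": "Person", "Confidence": 99.7},
    {"Name": "Car", "Confidence": 96.5},
    {"Name": "Convertible", "Confidence": 87.7},
    {"Name": "Hot Rod", "Confidence": 55.5},
]

def confident_labels(labels, min_confidence=95.0):
    """Keep only label names at or above the confidence threshold."""
    return [l["Name"] for l in labels if l["Confidence"] >= min_confidence]

print(confident_labels(labels))  # ['Person', 'Car']
```

Lowering `min_confidence` admits more speculative tags such as "Hot Rod"; the listings above print every label the service returned, regardless of score.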

Clarifai
created on 2018-02-09

vehicle 99.9
people 99.7
group together 99.2
group 98.8
transportation system 98.3
adult 98.1
convertible 97.7
man 95.8
several 95.2
military 95
war 93.8
car 93.1
administration 92.9
many 92.9
woman 90.3
aircraft 90
soldier 90
three 89.6
two 89.5
four 88.3

Imagga
created on 2018-02-09

man 32.2
laptop 27.6
people 27.3
car 26.7
male 26.2
business 24.9
adult 24
person 22.1
computer 21.7
office 20.5
sitting 19.7
happy 19.4
businessman 19.4
professional 18.7
working 18.5
windshield 17.4
work 17.3
corporate 17.2
men 15.4
screen 15.1
vehicle 14.1
smile 13.5
technology 13.4
worker 13.3
smiling 13
success 12.9
seat 12.6
executive 12.4
notebook 12.3
lifestyle 12.3
support 12.1
outdoors 12.1
women 11.9
businesswoman 11.8
together 11.4
looking 11.2
mature 11.2
communication 10.9
handsome 10.7
television 10.6
job 10.6
senior 10.3
happiness 10.2
casual 10.2
horizontal 10
transportation 9.9
team 9.8
cheerful 9.7
indoors 9.7
couple 9.6
automobile 9.6
businesspeople 9.5
manager 9.3
confident 9.1
suit 9
device 8.9
group 8.9
day 8.6
friends 8.4
black 8.4
portrait 8.4
glasses 8.3
world 8.3
successful 8.2
alone 8.2
student 8.1
billboard 8
chair 8
travel 7.7
driving 7.7
concentration 7.7
modern 7.7
pretty 7.7
attractive 7.7
old 7.7
outdoor 7.6
two 7.6
career 7.6
drive 7.6
desk 7.6
truck 7.5
teamwork 7.4
bench 7.3
indoor 7.3
covering 7.2
rumble 7.2

Microsoft
created on 2018-02-09

outdoor 99.7
tree 99.1
person 94.4
car 21.2

Face analysis

AWS Rekognition

Age 26-43
Gender Male, 91.7%
Angry 6.5%
Happy 0.8%
Confused 3.7%
Sad 13.7%
Surprised 2%
Calm 69%
Disgusted 4.3%

AWS Rekognition

Age 20-38
Gender Male, 74%
Disgusted 0.6%
Surprised 0.7%
Angry 3.3%
Calm 92.1%
Happy 0.3%
Sad 1.7%
Confused 1.3%

AWS Rekognition

Age 14-23
Gender Female, 72.6%
Angry 12.8%
Calm 55.3%
Sad 13%
Confused 4.9%
Disgusted 10.8%
Happy 1.8%
Surprised 1.4%
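Each AWS Rekognition face entry above reports a full emotion distribution; the face's dominant emotion is simply the highest-scoring entry. A minimal sketch, assuming the first face's scores have been transcribed into the `Type`/`Confidence` shape that Rekognition's DetectFaces API uses:

```python
# Emotion scores for the first detected face, transcribed from the
# record above (AWS Rekognition reports Type/Confidence pairs, in percent).
emotions = [
    {"Type": "ANGRY", "Confidence": 6.5},
    {"Type": "HAPPY", "Confidence": 0.8},
    {"Type": "CONFUSED", "Confidence": 3.7},
    {"Type": "SAD", "Confidence": 13.7},
    {"Type": "SURPRISED", "Confidence": 2.0},
    {"Type": "CALM", "Confidence": 69.0},
    {"Type": "DISGUSTED", "Confidence": 4.3},
]

def dominant_emotion(emotions):
    """Return the (type, confidence) pair with the highest score."""
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(emotions))  # ('CALM', 69.0)
```

The same selection applied to the other two faces yields CALM (92.1%) and CALM (55.3%), matching the listings above.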

Microsoft Cognitive Services

Age 20
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people sitting around a car 90%
a group of people sitting in a car 88.3%
a group of people sitting in front of a car 87.9%