Human Generated Data

Title

Untitled (man sitting in tiny racing car)

Date

1960

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16035.1

Machine Generated Data

Tags

Each machine tag below is followed by the service's confidence score on a 0-100 scale.

Amazon
created on 2022-02-11

Human 99.8
Person 99.8
Person 99.6
Person 99.5
Person 99.4
Person 99.4
Person 98.5
Person 98
Person 93.8
Car 93.1
Transportation 93.1
Automobile 93.1
Vehicle 93.1
Sports Car 89.3
Footwear 85.2
Clothing 85.2
Shoe 85.2
Apparel 85.2
Machine 80
Wheel 80
Race Car 79.4
Kart 76.1
Tire 69.2
Person 62.9
People 61
Asphalt 60.5
Tarmac 60.5
Formula One 58.9
Advertisement 58.1
Shoe 54.3
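
These labels match the output shape of Amazon Rekognition's DetectLabels API. A minimal Python sketch of the kind of call that could produce such a list (the file name and confidence threshold are illustrative assumptions, not taken from this record):

    import boto3

    # Rekognition client; assumes AWS credentials are configured locally.
    rekognition = boto3.client("rekognition")

    # Hypothetical local copy of the photograph.
    with open("photo.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed threshold; this record shows tags down to ~54
        )

    # Each label carries a name and a 0-100 confidence, as listed above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))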

Imagga
created on 2022-02-11

person 29.1
man 28.9
people 26.8
adult 25.9
male 25.1
chair 24.1
happy 23.8
wheelchair 23.1
sitting 22.3
smile 22.1
child 20.8
happiness 18.8
couple 18.3
lifestyle 18.1
seat 17.1
smiling 16.6
cheerful 16.2
men 15.5
women 15
fun 15
boy 13.9
kin 13.5
furniture 13.4
home 12.8
musical instrument 12.6
together 12.3
active 12.3
outdoor 12.2
outdoors 12.1
teen 11.9
attractive 11.9
casual 11.9
love 11.8
portrait 11.6
leisure 11.6
park 11.6
tricycle 11.5
group 11.3
youth 11.1
wheeled vehicle 10.9
cute 10.8
looking 10.4
education 10.4
mother 10.4
resort area 10.3
teenager 10
girls 10
joy 10
area 10
handsome 9.8
pretty 9.8
family 9.8
kid 9.7
sit 9.5
wall 9.4
friendship 9.4
study 9.3
stringed instrument 9.3
relax 9.3
house 9.2
student 9.2
laptop 9.1
activity 9
professional 8.9
work 8.9
indoors 8.8
school 8.5
life 8.5
floor 8.4
vehicle 8
interior 8
day 7.8
color 7.8
sport 7.7
health 7.6
two 7.6
room 7.6
fashion 7.5
relaxation 7.5
friends 7.5
one 7.5
book 7.3
playing 7.3
business 7.3
relaxing 7.3
parent 7.2
performer 7.2
bench 7.1
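
The Imagga tags follow the same name-plus-confidence pattern at much lower scores. Imagga exposes tagging through its documented /v2/tags REST endpoint; a hedged sketch in which the credentials and image URL are placeholders:

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credential
    API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder credential
    IMAGE_URL = "https://example.org/photo.jpg"  # hypothetical image location

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a 0-100 confidence.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))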

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

person 98.8
wheel 90.7
clothing 87.9
vehicle 78.5
text 74.5
land vehicle 72.7
footwear 67.1
tire 52.8
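
The Microsoft tags are consistent with Azure Computer Vision's image-analysis output; the SDK reports confidence on a 0-1 scale, which this page appears to show scaled to 0-100. A sketch using the Python SDK, with the endpoint, key, and image URL as placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes
    from msrest.authentication import CognitiveServicesCredentials

    # Placeholder endpoint and key.
    client = ComputerVisionClient(
        "https://<resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),
    )

    analysis = client.analyze_image(
        "https://example.org/photo.jpg",           # hypothetical image URL
        visual_features=[VisualFeatureTypes.tags],
    )
    for tag in analysis.tags:
        print(tag.name, round(tag.confidence * 100, 1))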

Face analysis

AWS Rekognition

Age 28-38
Gender Male, 97.8%
Calm 97.2%
Happy 1.4%
Sad 0.9%
Confused 0.2%
Angry 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 99.9%
Happy 99.7%
Calm 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0%
Confused 0%
Sad 0%
Disgusted 0%

AWS Rekognition

Age 30-40
Gender Male, 100%
Happy 87.5%
Disgusted 6.7%
Confused 1.5%
Calm 1.4%
Surprised 1%
Angry 0.7%
Sad 0.7%
Fear 0.6%

AWS Rekognition

Age 34-42
Gender Male, 59.1%
Calm 99.5%
Happy 0.3%
Sad 0.1%
Fear 0%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
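
The four AWS Rekognition blocks above, one per detected face, correspond to DetectFaces run with full attributes. A minimal sketch; the file name is an assumption:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required to get age range, gender, and emotions
        )

    # One FaceDetails entry per face; emotion scores sum to roughly 100%.
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")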

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 40
Gender Male
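
The two Microsoft entries, reporting only age and gender, look like output from the Azure Face API's detection call with those attributes requested. A hedged sketch with placeholder endpoint and key (Microsoft has since restricted access to these attributes):

    from azure.cognitiveservices.vision.face import FaceClient
    from msrest.authentication import CognitiveServicesCredentials

    face_client = FaceClient(
        "https://<resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("YOUR_AZURE_KEY"),     # placeholder key
    )

    faces = face_client.face.detect_with_url(
        url="https://example.org/photo.jpg",      # hypothetical image URL
        return_face_attributes=["age", "gender"],
    )
    for face in faces:
        attrs = face.face_attributes
        print(f"Age {attrs.age:.0f}", f"Gender {attrs.gender}")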

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
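
The Google Vision blocks report likelihood buckets rather than percentages, which matches the face_detection method of the Cloud Vision client library. A sketch, with the file name assumed:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()  # assumes GCP credentials are configured
    with open("photo.jpg", "rb") as f:      # hypothetical local copy
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Likelihoods are buckets (VERY_UNLIKELY ... VERY_LIKELY), not scores.
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)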

Feature analysis

Amazon

Person 99.8%
Shoe 85.2%
Wheel 80%
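
These three entries repeat labels from the tag list above; in a DetectLabels response, labels like these can also carry localized bounding-box instances. A sketch continuing the earlier Rekognition example:

    # Continues the DetectLabels sketch: labels with localized instances.
    for label in response["Labels"]:
        for instance in label.get("Instances", []):
            print(label["Name"], round(instance["Confidence"], 1),
                  instance["BoundingBox"])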

Captions

Microsoft

a group of people riding on the back of a pickup truck 54%
a group of people riding on the back of a vehicle 53.9%
a group of people sitting in a chair 53.8%

Text analysis

Amazon

Smith
Ca
lay Smith Ca
lay
MJI+
NAGON
HORSEPONE
И n. HORSEPONE
И n.
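
The fragments above ("Smith", "NAGON", the Cyrillic-looking "И n.") are raw OCR output. Rekognition's DetectText returns detections at both LINE and WORD granularity, which explains the overlapping entries. A sketch:

    import boto3

    rekognition = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # hypothetical local copy
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections aggregate the WORD detections, hence the repeats above.
    for detection in response["TextDetections"]:
        print(detection["Type"], detection["DetectedText"],
              round(detection["Confidence"], 1))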

Google

YT37
RODVK
Ae
Smilh RODVK YT37 Ae
Smilh
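
The Google results show the same overlap, a full line ("Smilh RODVK YT37 Ae") plus its individual words, matching the text_detection method of the Cloud Vision client, whose first annotation is the full detected block:

    # Reuses the hypothetical Vision `client` and `image` from the face sketch.
    response = client.text_detection(image=image)

    # The first annotation is the full text block; the rest are individual words.
    for annotation in response.text_annotations:
        print(annotation.description)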