Human Generated Data

Title

Untitled (many people seated inside city bus)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14774

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Human 99
Person 99
Person 98.9
Person 98.5
Person 97.5
Person 94.6
Person 94.3
Transportation 84.4
Vehicle 84
Person 82.9
Steamer 78.8
Apparel 76.2
Clothing 76.2
Person 71.5
Person 68.8
Train 62.2
Indoors 57.8
People 56.8
Person 46.1
Person 44.7
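The Rekognition tags above pair each label with a confidence percentage, and several labels (Steamer, Train) are low-confidence misreads of the bus interior. A minimal sketch of how such a list might be post-processed, using a sample of the scores listed above (the function and threshold are illustrative, not part of the Rekognition API):

```python
# Hypothetical post-processing of the Rekognition label list above:
# each entry pairs a label name with its confidence percentage.
labels = [
    ("Human", 99.0), ("Person", 99.0), ("Transportation", 84.4),
    ("Vehicle", 84.0), ("Steamer", 78.8), ("Train", 62.2),
    ("Indoors", 57.8), ("Person", 46.1),
]

def confident_labels(labels, threshold=80.0):
    """Keep distinct label names at or above the confidence threshold."""
    seen = set()
    kept = []
    for name, score in labels:
        if score >= threshold and name not in seen:
            seen.add(name)
            kept.append(name)
    return kept

print(confident_labels(labels))
# ['Human', 'Person', 'Transportation', 'Vehicle']
```

Raising the threshold to 80% drops the implausible Steamer and Train labels while keeping the labels consistent with the photograph's subject.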

Imagga
created on 2022-01-29

people 24
business 23.7
interior 23
shop 22.9
modern 22.4
supermarket 20.5
architecture 20.5
work 20.5
man 18.1
clothing 17.8
adult 17.6
building 17.4
mercantile establishment 16.7
office 16.3
urban 15.7
men 15.4
equipment 15.2
person 14.9
indoors 14.9
women 14.2
city 14.1
glass 14.1
inside 13.8
grocery store 13.5
professional 12.9
casual 12.7
mall 12.6
technology 12.6
window 12.4
working 12.4
medical 12.3
male 12
restaurant 11.9
hall 11.6
businessman 11.5
design 11.2
shopping 11
lifestyle 10.8
worker 10.8
team 10.7
hospital 10.6
wagon 10.5
room 10.5
boutique 10.5
doctor 10.3
corporate 10.3
place of business 10.1
marketplace 10.1
occupation 10.1
health 9.7
medicine 9.7
group 9.7
life 9.5
table 9.5
walking 9.5
indoor 9.1
fashion 9
brassiere 9
clinic 9
nurse 9
science 8.9
patient 8.6
customer 8.6
buy 8.4
wheeled vehicle 8.3
human 8.2
metal 8
salon 8
light 8
looking 8
job 8
barbershop 7.9
drawing 7.9
education 7.8
lab 7.8
chemical 7.7
laboratory 7.7
construction 7.7
research 7.6
retail 7.6
communication 7.5
store 7.5
house 7.5
teamwork 7.4
businesswoman 7.3
center 7.2
surgeon 7.2
student 7.2
woman's clothing 7.2
undergarment 7.2
garment 7.2
chair 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 97.8
person 92
clothing 87.6
black and white 74
people 68.2
group 65.4
man 63.1
woman 62.7
clothes 33.6
store 31.6

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 88.9%
Surprised 70.5%
Happy 18.8%
Calm 9%
Disgusted 0.5%
Confused 0.5%
Angry 0.4%
Sad 0.3%
Fear 0.1%

AWS Rekognition

Age 31-41
Gender Female, 95.2%
Calm 62%
Disgusted 15%
Fear 13.2%
Happy 2.5%
Surprised 2.1%
Sad 1.8%
Angry 1.7%
Confused 1.6%

AWS Rekognition

Age 43-51
Gender Male, 56.9%
Calm 80%
Sad 11.1%
Happy 2.5%
Surprised 2.4%
Confused 1.7%
Angry 1.2%
Disgusted 0.7%
Fear 0.4%

AWS Rekognition

Age 36-44
Gender Male, 99.7%
Disgusted 39.7%
Sad 21.6%
Angry 15.8%
Fear 11.7%
Happy 4%
Calm 3.7%
Surprised 1.7%
Confused 1.6%

AWS Rekognition

Age 43-51
Gender Male, 79.1%
Calm 44%
Happy 34.9%
Confused 11%
Sad 6%
Fear 1.5%
Disgusted 1.1%
Angry 0.9%
Surprised 0.7%

AWS Rekognition

Age 16-24
Gender Female, 96.7%
Calm 95.5%
Sad 2.1%
Surprised 1.2%
Confused 0.4%
Disgusted 0.4%
Fear 0.2%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 33-41
Gender Male, 66.3%
Sad 83.1%
Fear 8.9%
Calm 4.4%
Happy 1.3%
Confused 0.9%
Angry 0.8%
Disgusted 0.4%
Surprised 0.1%

AWS Rekognition

Age 26-36
Gender Male, 91.4%
Calm 56.2%
Sad 25.1%
Happy 5.1%
Confused 4.9%
Angry 4.3%
Fear 1.6%
Disgusted 1.6%
Surprised 1.2%

AWS Rekognition

Age 22-30
Gender Male, 86.5%
Surprised 55.2%
Calm 41.2%
Angry 0.9%
Disgusted 0.9%
Happy 0.9%
Fear 0.4%
Confused 0.3%
Sad 0.2%
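Each AWS Rekognition face record above lists a full emotion distribution summing to roughly 100%. One hedged way to summarize such records is to take the highest-confidence emotion per face; the sample data below is drawn from the first four face records, and the helper function is illustrative rather than part of any API:

```python
# Hypothetical summary of the per-face emotion scores listed above:
# each dict maps emotion -> confidence (%) for one detected face.
faces = [
    {"Surprised": 70.5, "Happy": 18.8, "Calm": 9.0},
    {"Calm": 62.0, "Disgusted": 15.0, "Fear": 13.2},
    {"Calm": 80.0, "Sad": 11.1, "Happy": 2.5},
    {"Disgusted": 39.7, "Sad": 21.6, "Angry": 15.8},
]

def dominant_emotions(faces):
    """Return the highest-confidence emotion for each face."""
    return [max(face, key=face.get) for face in faces]

print(dominant_emotions(faces))
# ['Surprised', 'Calm', 'Calm', 'Disgusted']
```

Note that a dominant emotion can still carry low absolute confidence (39.7% for the fourth face), so a threshold check may be warranted before treating the label as reliable.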

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%
Train 62.2%

Captions

Microsoft

a group of people in a store 88.6%
a group of people standing in front of a store 88.4%
a group of people standing in a store 87.3%
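The three Microsoft caption candidates above differ by only about one percentage point of confidence. A minimal sketch of selecting the top candidate, using the exact captions and scores listed (the selection logic is an assumption, not the service's own behavior):

```python
# Hypothetical selection among the Microsoft caption candidates above,
# each paired with its confidence score.
captions = [
    ("a group of people in a store", 88.6),
    ("a group of people standing in front of a store", 88.4),
    ("a group of people standing in a store", 87.3),
]

best_caption, best_score = max(captions, key=lambda c: c[1])
print(best_caption)
# a group of people in a store
```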

Text analysis

Amazon

DRUG STORE
BLDG
and BLDG DRUG STORE
-
NAGGA
1
D
un
and