Human Generated Data

Title

Untitled (women lined up in front of counter, seen from behind shop counter)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14372

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Person 98.7
Person 98.7
Person 95.4
Person 95.1
Person 93.4
Person 86.9
Person 86.1
Clothing 85.4
Apparel 85.4
Sitting 78.5
People 73.8
Restaurant 73.4
Face 71
Meal 67.4
Food 67.4
Female 65.4
Person 65.2
Portrait 64.6
Photography 64.6
Photo 64.6
Dish 60.3
Cafeteria 58.3
Girl 56.6
Indoors 55.8
Crowd 55.5

Clarifai
created on 2023-10-27

people 99.9
group 99.5
adult 98.9
man 98.4
woman 98.4
monochrome 96.8
leader 94.1
child 94
administration 93.7
sit 92
group together 91.3
several 90.9
transportation system 90.6
vehicle 88.9
many 87.8
four 87
wear 85.1
three 84.3
indoors 81.6
recreation 79.5

Imagga
created on 2022-01-29

salon 65.8
people 34.5
man 30.9
office 28.5
person 28.3
adult 27.2
computer 25.8
male 24.1
working 23.8
business 23.7
laptop 23.4
sitting 21.5
happy 21.3
businesswoman 19.1
technology 18.5
smiling 18.1
portrait 17.5
case 17.4
work 17.2
professional 17.1
looking 16.8
home 16.7
indoors 16.7
senior 15.9
table 15.9
businessman 15.9
desk 15.5
group 15.3
worker 14.2
education 13.8
lifestyle 13.7
women 13.4
together 13.1
smile 12.8
one 12.7
suit 12.6
room 12.3
meeting 12.2
couple 12.2
executive 12.1
teamwork 12
men 12
indoor 11.9
communication 11.7
job 11.5
businesspeople 11.4
face 11.4
modern 11.2
corporate 11.2
mature 11.1
happiness 11
television 10.7
pretty 10.5
casual 10.2
team 9.8
attractive 9.8
doctor 9.4
student 9.3
confident 9.1
human 9
lady 8.9
success 8.8
medical 8.8
equipment 8.6
husband 8.6
notebook 8.5
expression 8.5
finance 8.4
monitor 8.4
manager 8.4
classroom 8.4
house 8.3
successful 8.2
care 8.2
hair 7.9
window 7.9
black 7.8
iron lung 7.8
old 7.7
elderly 7.7
employee 7.6
serious 7.6
two 7.6
relax 7.6
glasses 7.4
coat 7.1
handsome 7.1
family 7.1
interior 7.1

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

person 98.1
text 91.8
indoor 88.8
window 87.6
black and white 83.7
human face 65.4
street 63.1
sushi 58.5
dish 56.7

Face analysis

Amazon

Google

AWS Rekognition

Age 28-38
Gender Female, 98.2%
Happy 57.9%
Surprised 27.4%
Sad 6.3%
Fear 2.1%
Disgusted 1.9%
Calm 1.9%
Angry 1.5%
Confused 1.1%

AWS Rekognition

Age 41-49
Gender Male, 79.4%
Happy 75.2%
Surprised 10.9%
Sad 6.1%
Calm 3.9%
Fear 1.4%
Angry 1.1%
Disgusted 0.8%
Confused 0.5%

AWS Rekognition

Age 35-43
Gender Male, 90.8%
Sad 64.5%
Confused 12.2%
Happy 9.7%
Calm 7.7%
Surprised 3.2%
Disgusted 1.4%
Fear 0.7%
Angry 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.4%
Person 98.7%
Person 98.7%
Person 95.4%
Person 95.1%
Person 93.4%
Person 86.9%
Person 86.1%
Person 65.2%

Text analysis

Amazon

ANGELO
LONGARDO
HERLA

Google

ANGALD LOMGAROO
ANGALD
LOMGAROO