Human Generated Data

Title

Untitled (crowded schools)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16060.1

Machine Generated Data

Tags

Amazon
created on 2022-02-11

Human 99.2
Person 99.2
Person 99.2
Person 99
Person 99
Person 97.7
Person 97.7
Person 96.5
Person 95.9
Clinic 95.6
Person 93.4
Indoors 89.3
Interior Design 89.3
Room 88.2
Person 77.3
Hospital 73.9
Face 69.2
Furniture 69.2
Sitting 67.3
Person 66.7
People 62.2
Living Room 59.6
Flooring 58.2
Operating Theatre 58.1
Doctor 57.5
Screen 56.8
Display 56.8
Monitor 56.8
Electronics 56.8
Shoe 55
Apparel 55
Clothing 55
Footwear 55
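
Labels like these can be reproduced with AWS Rekognition's DetectLabels API. A minimal sketch using boto3 follows; the local file name is a placeholder, and MinConfidence=55 simply matches the lowest score shown above.

# Minimal sketch: DetectLabels on a local copy of the photograph (boto3).
import boto3

client = boto3.client("rekognition")
with open("crowded_schools.jpg", "rb") as f:  # placeholder file name
    response = client.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence, e.g. "Person 99.2".
    print(f"{label['Name']} {label['Confidence']:.1f}")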

Imagga
created on 2022-02-11

room 64.8
classroom 54.5
interior 38.9
modern 26.6
table 24.4
people 23.4
chair 23.3
indoors 22.8
window 21.9
business 21.2
restaurant 20.3
home 19.9
person 19.8
furniture 19.6
man 19.5
office 19.3
house 19.2
indoor 19.2
shop 18.6
inside 18.4
design 16.9
building 16.2
cafeteria 16.1
businessman 15.9
wood 15
work 15
lifestyle 14.4
working 14.1
kitchen 13.6
light 12.7
women 12.6
decor 12.4
contemporary 12.2
group 12.1
floor 12.1
computer 12
men 12
communication 11.7
hospital 11.7
male 11.3
structure 11.2
architecture 10.9
glass 10.9
barbershop 10.7
dining 10.5
meeting 10.4
corporate 10.3
adult 10.2
job 9.7
desk 9.7
comfortable 9.5
executive 9.2
patient 9.2
occupation 9.2
mercantile establishment 9
life 9
chairs 8.8
case 8.8
urban 8.7
gesture 8.6
wall 8.5
space 8.5
living 8.5
3d 8.5
worker 8.3
plant 8.2
team 8.1
decoration 8
smiling 8
food 7.9
professional 7.8
empty 7.7
corporation 7.7
luxury 7.7
hall 7.7
apartment 7.7
tile 7.6
buy 7.5
city 7.5
technology 7.4
style 7.4
teamwork 7.4
television 7.4
seat 7.2
together 7
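
The Imagga list above is the kind of output returned by Imagga's /v2/tags endpoint, which takes an image and returns confidence-scored tags sorted high to low. A minimal sketch follows; the credentials and image URL are placeholders.

# Minimal sketch: Imagga tagging via its REST API (requests).
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/crowded_schools.jpg"},  # placeholder URL
    auth=("api_key", "api_secret"),  # placeholder credentials
)
for item in resp.json()["result"]["tags"]:
    # Tags arrive sorted by confidence, e.g. "room 64.8".
    print(f"{item['tag']['en']} {item['confidence']:.1f}")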

Google
created on 2022-02-11

Microsoft
created on 2022-02-11

person 96.4
text 95.8
clothing 93.4
black and white 68.5
man 64.5
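
These Microsoft tags correspond to Azure Computer Vision's Analyze Image call; the captions listed under "Captions" further below likely come from the same call's Description feature. A minimal sketch follows; the endpoint, key, and image URL are placeholders.

# Minimal sketch: Azure Computer Vision v3.2 Analyze Image (requests).
import requests

endpoint = "https://<resource>.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags,Description"},
    headers={"Ocp-Apim-Subscription-Key": "<key>"},  # placeholder key
    json={"url": "https://example.org/crowded_schools.jpg"},  # placeholder URL
)
data = resp.json()
for tag in data["tags"]:
    # Raw confidences are 0-1; the list above shows them scaled to 0-100.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")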

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 74.4%
Sad 71%
Angry 13.5%
Calm 4.6%
Disgusted 3.9%
Confused 3.4%
Happy 1.8%
Fear 0.9%
Surprised 0.9%

AWS Rekognition

Age 33-41
Gender Male, 89.7%
Calm 98.9%
Sad 0.2%
Happy 0.2%
Angry 0.2%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 29-39
Gender Male, 99.2%
Happy 84.6%
Calm 9%
Sad 1.8%
Disgusted 1%
Confused 1%
Fear 0.9%
Angry 0.9%
Surprised 0.6%

AWS Rekognition

Age 18-26
Gender Male, 99.7%
Calm 94.5%
Sad 2.6%
Confused 1.1%
Surprised 0.5%
Angry 0.4%
Disgusted 0.4%
Happy 0.3%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Female, 61.6%
Calm 91%
Sad 4.6%
Happy 2.4%
Confused 0.7%
Fear 0.5%
Surprised 0.4%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 43-51
Gender Male, 84.2%
Calm 82.7%
Sad 6.4%
Happy 3.9%
Surprised 3.8%
Confused 1.7%
Disgusted 0.7%
Angry 0.7%
Fear 0.2%

AWS Rekognition

Age 20-28
Gender Male, 91%
Sad 99.3%
Happy 0.4%
Fear 0.1%
Confused 0.1%
Disgusted 0%
Calm 0%
Angry 0%
Surprised 0%

AWS Rekognition

Age 23-33
Gender Male, 98.1%
Happy 63.8%
Calm 22.5%
Fear 5.7%
Sad 2.4%
Disgusted 2.2%
Angry 1.2%
Surprised 1.2%
Confused 1%

AWS Rekognition

Age 38-46
Gender Male, 67.3%
Surprised 67.5%
Calm 12.3%
Happy 11.6%
Confused 3.8%
Sad 1.5%
Disgusted 1.3%
Angry 1.3%
Fear 0.8%

AWS Rekognition

Age 18-26
Gender Female, 83.6%
Calm 69.6%
Sad 16.7%
Angry 4.9%
Confused 2.8%
Disgusted 2.2%
Happy 1.9%
Surprised 1%
Fear 0.9%
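
Each AWS Rekognition block above corresponds to one entry in the FaceDetails list returned by DetectFaces. A minimal sketch follows (boto3; the file name is a placeholder); Attributes=["ALL"] is required to get the age, gender, and emotion fields.

# Minimal sketch: DetectFaces with full attributes (boto3).
import boto3

client = boto3.client("rekognition")
with open("crowded_schools.jpg", "rb") as f:  # placeholder file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are reported per face, sorted here by confidence,
    # matching the "Calm 98.9% / Sad 0.2% ..." blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")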

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
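
The Google Vision blocks above report per-face likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores. A minimal sketch with the google-cloud-vision client follows; the file name is a placeholder.

# Minimal sketch: Google Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("crowded_schools.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # One bucket per attribute, matching the "Surprise Very unlikely" rows.
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name)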

Feature analysis

Amazon

Person 99.2%
Shoe 55%

Captions

Microsoft

a group of people in a room 96.2%
a group of people sitting and standing in front of a window 86.3%
a group of people sitting in front of a window 82.2%

Text analysis

Amazon

12
SOCIAL
MJIR
XAOOX
MJIR YY37A2
YY37A2
absics
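
These Amazon fragments are the kind of output produced by Rekognition's DetectText call. A minimal sketch follows (boto3; the file name is a placeholder).

# Minimal sketch: DetectText OCR on the photograph (boto3).
import boto3

client = boto3.client("rekognition")
with open("crowded_schools.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Both LINE and WORD detections are returned with raw strings.
    print(detection["Type"], detection["DetectedText"])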

Google

MJI3 YT3R A2 XACOX 12
MJI3
12
XACOX
YT3R
A2
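
The Google fragments correspond to Cloud Vision's text detection (OCR); on a photograph like this the detected "text" is likely incidental markings rather than readable words, hence the garbled strings. A minimal sketch follows; the file name is a placeholder.

# Minimal sketch: Google Cloud Vision text detection (OCR).
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("crowded_schools.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # The first annotation is the full detected block; the rest are words.
    print(annotation.description)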