Human Generated Data

Title

Untitled (children at desks in classroom)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16849

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 98.2
Person 98.2
Person 97.9
Person 97.7
Person 96.4
Person 95.7
Room 95.3
Indoors 95.3
School 93.6
Classroom 93.4
Furniture 90.9
Chair 90.9
Person 89.5
Person 88.4
Chair 84
Chair 80.4
Person 79.7
Workshop 78.6
Person 76.9
Person 68.8
People 63.9
Apparel 63.2
Clothing 63.2
Footwear 63.2
Shoe 63.2
Chair 59.8
Clinic 58
Cafeteria 56.1
Restaurant 56.1
Housing 55.1
Building 55.1

Imagga
created on 2022-02-26

room 49.3
classroom 44
chair 38.1
cafeteria 36.2
building 35.1
restaurant 29.4
table 28.7
interior 28.3
structure 27.7
modern 20.3
floor 19.5
people 19
seat 17.8
indoors 17.6
man 16.8
house 16.7
chairs 16.6
furniture 16.5
office 16.2
business 15.8
library 14.5
lifestyle 14.4
architecture 14.4
window 13.9
sitting 13.7
group 13.7
indoor 13.7
decor 13.3
glass 13.2
urban 13.1
city 12.5
design 12.4
inside 12
relaxation 11.7
wood 11.7
team 11.6
businessman 11.5
comfortable 11.5
male 11.3
home 11.2
men 11.2
women 11.1
person 10.8
tables 10.8
hospital 10.6
work 10.3
patio 10.2
dining 9.5
area 9.4
empty 9.4
happy 9.4
center 9.4
relax 9.3
hall 9
day 8.6
corporate 8.6
luxury 8.6
3d 8.5
meeting 8.5
counter 8.3
domestic 8.1
reflection 8.1
life 8.1
working 8
adult 7.9
school 7.9
teacher 7.8
elegance 7.6
professional 7.5
leisure 7.5
outdoors 7.5
training 7.4
executive 7.4
light 7.4
kitchen 7.3
worker 7.2
wooden 7
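
The Imagga tags follow the same tag/confidence shape and can be requested from Imagga's v2 tagging endpoint, which uses HTTP basic auth with an API key and secret. A sketch with requests (the credentials and image URL are placeholders, and the exact response shape is an assumption based on Imagga's v2 API):

import requests

# Placeholder credentials; Imagga authenticates with a key/secret pair.
AUTH = ("acc_xxxxxxxxxxxx", "xxxxxxxxxxxxxxxx")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/16849_scan.jpg"},  # placeholder URL
    auth=AUTH,
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a confidence score.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))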

Google
created on 2022-02-26

(no tags returned)

Microsoft
created on 2022-02-26

window 97.5
indoor 94.4
table 94.2
living 88.3
house 83.4
person 81.8
desk 72.9
clothing 71.1
black and white 69
text 67.1
chair 63
computer 53.8
furniture 41.4
family 22.8
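
The Microsoft tags are the kind returned by Azure Computer Vision's image-tagging operation. A sketch using the azure-cognitiveservices-vision-computervision SDK (the endpoint, key, and filename are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

with open("16849_scan.jpg", "rb") as f:  # hypothetical filename
    result = client.tag_image_in_stream(f)

# Confidences come back in [0, 1]; the page shows them as percentages.
for tag in result.tags:
    print(tag.name, round(tag.confidence * 100, 1))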

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 63%
Calm 99.8%
Sad 0.1%
Happy 0%
Surprised 0%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Male, 98%
Calm 99.8%
Sad 0.1%
Angry 0.1%
Happy 0%
Disgusted 0%
Surprised 0%
Confused 0%
Fear 0%
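
The two blocks above are per-face results in the shape AWS Rekognition's DetectFaces returns when called with Attributes=['ALL']: an estimated age range, a gender guess with confidence, and a confidence score for each emotion. A minimal sketch (filename hypothetical):

import boto3

rekognition = boto3.client("rekognition")

with open("16849_scan.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions are scored individually; sort to match the page's ordering.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")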

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
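
The four identical blocks above read as one block per face that Google Cloud Vision's face detection found, with every attribute rated Very unlikely. A sketch with the google-cloud-vision client (filename hypothetical; assumes application-default credentials are configured):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("16849_scan.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; likelihoods are enum values
# such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)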

Feature analysis

Amazon

Person 98.2%
Chair 90.9%
Shoe 63.2%
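
The Feature analysis confidences match the top Person, Chair, and Shoe tag scores above, so a plausible reading is that this section lists the DetectLabels entries that came back with per-object bounding boxes. A sketch filtering for those (filename hypothetical):

import boto3

rekognition = boto3.client("rekognition")

with open("16849_scan.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(Image={"Bytes": f.read()})

# Labels with Instances are localized objects rather than scene-level tags.
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]  # Left/Top/Width/Height, relative to image size
        print(f"{label['Name']} {instance['Confidence']:.1f}% {box}")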

Captions

Microsoft

a group of people sitting around a living room 92.7%
a group of people in a living room filled with furniture and a window 89.9%
a group of people sitting at a table in a room 89.8%
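
Captions of this kind, each with a confidence, are what Azure Computer Vision's describe operation produces. A sketch requesting several candidate captions (the endpoint, key, and filename are placeholders):

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),  # placeholder key
)

with open("16849_scan.jpg", "rb") as f:  # hypothetical filename
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")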