Human Generated Data

Title

Untitled (portrait of nun standing next to women seated in chairs)

Date

c. 1930

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4369

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Chair 99.8
Furniture 99.8
Human 99.1
Person 99.1
Person 99.1
Footwear 98.4
Clothing 98.4
Shoe 98.4
Apparel 98.4
Person 95.8
Person 92.7
Shoe 92.4
Room 89.3
Indoors 89.3
Shoe 85.2
Floor 68.1
Photography 65.1
Photo 65.1
Flooring 64
Portrait 62.3
Face 62.3
People 62
Living Room 58.4
Crib 57.6
Cradle 57.2
Interior Design 56.2
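The scores above are per-label confidence percentages; in practice such output is consumed by filtering on a threshold. A minimal sketch (label values copied from the Amazon list above; the 90% cutoff is illustrative, not part of the record):

```python
# Labels and confidence scores as returned by a vision API
# (a subset copied from the Amazon Rekognition list above).
labels = {
    "Chair": 99.8, "Furniture": 99.8, "Person": 99.1,
    "Shoe": 98.4, "Room": 89.3, "Indoors": 89.3,
    "Floor": 68.1, "Portrait": 62.3, "Crib": 57.6,
}

# Keep only labels at or above an illustrative 90% confidence threshold.
confident = {name: score for name, score in labels.items() if score >= 90.0}
print(sorted(confident))  # ['Chair', 'Furniture', 'Person', 'Shoe']
```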

Clarifai
created on 2019-06-01

people 99.9
group 99.5
group together 99.1
child 98.8
adult 97.2
woman 97
family 95.9
man 95.4
room 95.1
several 94.8
furniture 94.8
wear 94.3
many 93.9
indoors 93.4
education 93.3
five 92.5
school 90.5
boy 89.7
leader 86.7
home 86.5

Imagga
created on 2019-06-01

brass 37.1
people 31.8
wind instrument 31.4
musical instrument 30.2
man 28.9
men 26.6
business 26.1
businessman 25.6
male 25.5
person 25.2
adult 21.4
room 20.9
blackboard 20.3
chair 18.6
classroom 17.1
job 16.8
women 16.6
professional 16.2
group 16.1
trombone 16
office 15.3
work 15
silhouette 14.1
indoors 14.1
team 13.4
interior 13.3
meeting 13.2
corporate 12.9
worker 12.5
black 12.1
businesswoman 11.8
holding 11.6
fun 11.2
cornet 11
teacher 11
indoor 11
employee 10.7
crowd 10.6
boss 10.5
modern 10.5
executive 10.3
percussion instrument 10.3
device 10.2
youth 10.2
two 10.2
lifestyle 10.1
communication 10.1
suit 9.9
hand 9.9
handsome 9.8
to 9.7
hall 9.7
table 9.6
boy 9.6
laptop 9.3
leisure 9.1
happy 8.8
standing 8.7
portrait 8.4
attractive 8.4
teamwork 8.3
city 8.3
fashion 8.3
cheerful 8.1
furniture 8.1
looking 8
urban 7.9
day 7.8
couple 7.8
happiness 7.8
hands 7.8
students 7.8
casual 7.6
elegance 7.6
human 7.5
floor 7.4
sax 7.3
girls 7.3
computer 7.2
board 7.2
smiling 7.2
smile 7.1
working 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

floor 95.4
clothing 92.4
indoor 89.4
footwear 88.5
person 82.2
toddler 54.1
furniture 51.1

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.9%
Happy 45.9%
Disgusted 45.1%
Angry 45.2%
Surprised 45.2%
Sad 45.5%
Calm 53%
Confused 45.1%

AWS Rekognition

Age 26-44
Gender Female, 53.5%
Sad 46.7%
Angry 45.2%
Surprised 45.4%
Calm 51.7%
Disgusted 45.2%
Happy 45.5%
Confused 45.3%

AWS Rekognition

Age 35-52
Gender Male, 51.5%
Confused 45.4%
Angry 45.4%
Sad 53.9%
Calm 45.1%
Disgusted 45.1%
Happy 45.1%
Surprised 45.1%

Feature analysis

Amazon

Chair 99.8%
Person 99.1%
Shoe 98.4%

Captions

Microsoft

a group of people in a room 89.8%
a person standing in a room 86.5%
a person in a white room 76.8%