Human Generated Data

Title

Untitled (seated women with children)

Date

1946

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15479

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Person 99.3
Human 99.3
Person 99.1
Person 97.3
Chair 96
Furniture 96
Shoe 95
Footwear 95
Person 94.7
Person 93.2
Shorts 86.2
Person 84.8
Face 77.2
Shoe 76.1
Guitar 74.6
Musical Instrument 74.6
Leisure Activities 74.6
Female 71.6
People 66.6
Portrait 66.3
Photography 66.3
Photo 66.3
Hat 65.3
Girl 64.6
Kid 61.9
Child 61.9
Drawing 59.5
Art 59.5
Table 58.5
Indoors 56.4
Woman 56.2
Sitting 55.2
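
Tag lists like the Amazon block above are typically returned by an object-detection service as label/confidence pairs. A minimal sketch of filtering and sorting such a result by confidence, using a hypothetical response shaped like AWS Rekognition's `detect_labels` output (a real call would require `boto3` and AWS credentials):

```python
# Sketch: filter machine-generated tags by confidence, as in the listing above.
# The sample response below is hypothetical, shaped like an AWS Rekognition
# detect_labels result; a few values are copied from the tags above.

def top_labels(response, threshold=55.0):
    """Return (name, confidence) pairs at or above threshold, highest first."""
    pairs = [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]
    return sorted((p for p in pairs if p[1] >= threshold),
                  key=lambda p: p[1], reverse=True)

sample = {"Labels": [
    {"Name": "Clothing", "Confidence": 99.9},
    {"Name": "Person", "Confidence": 99.3},
    {"Name": "Guitar", "Confidence": 74.6},
    {"Name": "Sitting", "Confidence": 55.2},
    {"Name": "Dog", "Confidence": 12.0},   # below threshold, dropped
]}

for name, conf in top_labels(sample):
    print(f"{name} {conf:.1f}")
```

Each tagging service in this record applies its own cutoff; the listings above show only labels that cleared that service's threshold.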

Clarifai
created on 2023-10-29

people 99.9
group 98.9
monochrome 98.8
adult 97.9
man 97.6
woman 97.1
child 96.3
group together 95.4
wear 94.3
dancing 92.4
dancer 91.8
family 91
portrait 90.3
music 89.7
recreation 87.1
actor 85.5
boy 84.6
adolescent 83.4
art 83.3
education 82.7

Imagga
created on 2022-03-05

person 32.3
people 32.3
man 28.9
teacher 28.7
indoors 27.2
home 27.1
adult 26.6
room 24.1
male 22
interior 20.3
women 19.8
sitting 19.7
smiling 18.8
happy 18.8
lifestyle 18.8
blackboard 18.6
chair 18
house 17.5
professional 17.4
classroom 17
laptop 16.8
indoor 16.4
looking 16
portrait 15.5
table 15.2
education 14.7
computer 14.5
casual 14.4
educator 14.1
student 13.7
group 13.7
mature 13
kin 13
office 12.8
business 12.7
class 12.5
men 12
mother 12
school 11.6
family 11.6
businessman 11.5
black 11.4
couple 11.3
study 11.2
child 11
desk 10.7
teaching 10.7
smile 10.7
working 10.6
studying 10.5
modern 10.5
work 10.3
senior 10.3
life 10.3
back 10.1
board 9.9
holding 9.9
kid 9.7
two people 9.7
together 9.6
standing 9.6
clothing 9.5
happiness 9.4
furniture 9.2
alone 9.1
children 9.1
love 8.7
hospital 8.6
face 8.5
meeting 8.5
pretty 8.4
grandfather 8.3
fashion 8.3
girls 8.2
sibling 8.2
team 8.1
grandma 8
window 7.7
mid adult 7.7
test 7.7
exam 7.7
blond 7.6
togetherness 7.5
horizontal 7.5
human 7.5
one 7.5
worker 7.4
light 7.3
cheerful 7.3
lady 7.3
sexy 7.2
hair 7.1
job 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.4
book 95.4
clothing 93.4
person 85.4
footwear 74.5
drawing 71.7
posing 61.7
female 25

Face analysis

AWS Rekognition

Age 31-41
Gender Female, 93.5%
Sad 39.3%
Happy 32.7%
Surprised 14.2%
Calm 5.3%
Confused 4.3%
Fear 1.7%
Disgusted 1.4%
Angry 1%

AWS Rekognition

Age 24-34
Gender Female, 91.8%
Calm 55.7%
Sad 12.8%
Surprised 10%
Confused 7.5%
Disgusted 5.3%
Happy 3.6%
Angry 2.8%
Fear 2.3%

AWS Rekognition

Age 48-54
Gender Female, 95.4%
Surprised 93.1%
Happy 3.3%
Calm 2.7%
Sad 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.1%
Angry 0.1%
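
Each AWS Rekognition face block above reports a confidence distribution over eight emotions. A small sketch of reducing such a distribution to its dominant emotion, with scores copied from the third face block above:

```python
# Sketch: pick the dominant emotion from a Rekognition-style emotion
# distribution. Scores below are taken from the third face block above.

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(scores.items(), key=lambda kv: kv[1])

face3 = {
    "Surprised": 93.1, "Happy": 3.3, "Calm": 2.7, "Sad": 0.2,
    "Disgusted": 0.2, "Confused": 0.2, "Fear": 0.1, "Angry": 0.1,
}

print(dominant_emotion(face3))
```

For the first two faces the distributions are much flatter (Sad 39.3% vs. Happy 32.7%; Calm 55.7%), so a single dominant label is less reliable there.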

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
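
Google Vision reports face attributes as likelihood buckets rather than numeric scores. A sketch of comparing such bucket values by ordering them; the ordering below is an assumption based on the Vision API's likelihood enum (VERY_UNLIKELY through VERY_LIKELY):

```python
# Sketch: compare Google Vision-style likelihood strings via an ordinal scale.
# The bucket order is an assumption modeled on the Vision API likelihood enum.
LIKELIHOOD_ORDER = ["Very unlikely", "Unlikely", "Possible",
                    "Likely", "Very likely"]

def at_least(value, threshold):
    """True if `value` is at or above `threshold` in the likelihood order."""
    return LIKELIHOOD_ORDER.index(value) >= LIKELIHOOD_ORDER.index(threshold)

# The third face above is flagged "Blurred Very likely":
print(at_least("Very likely", "Likely"))    # True
print(at_least("Very unlikely", "Likely"))  # False
```

This kind of ordinal check is how one would filter, say, blurred faces out of a batch of Vision results.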

Feature analysis

Amazon

Person
Shoe
Guitar
Person 99.3%
Person 99.1%
Person 97.3%
Person 94.7%
Person 93.2%
Person 84.8%
Shoe 95%
Shoe 76.1%
Guitar 74.6%
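
The feature analysis above lists one entry per detected instance (six Person detections, two Shoe, one Guitar). A short sketch of grouping per-instance detections into such counts:

```python
from collections import Counter

# Sketch: count detected instances per label, as in the feature listing above.
detections = ["Person", "Person", "Person", "Person", "Person", "Person",
              "Shoe", "Shoe", "Guitar"]

counts = Counter(detections)
for label, n in counts.most_common():
    print(f"{label}: {n}")
```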