Human Generated Data

Title

Untitled (two boys with two dogs)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17567

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Shorts 99.9
Clothing 99.9
Apparel 99.9
Person 99.6
Human 99.6
Brick 99.2
Person 99
Face 92.6
Female 87.2
Person 81.2
Helmet 80.9
Chair 79.7
Furniture 79.7
Shoe 78.3
Footwear 78.3
Girl 75.9
Smile 75.8
Kid 72.6
Child 72.6
Portrait 65.7
Photography 65.7
Photo 65.7
Woman 65.6
People 64.3
Outdoors 63.2
Canine 62.6
Animal 62.6
Mammal 62.6
Pet 60.9
Plant 60.6
Play 58.7
Crowd 56.2
Suit 56
Coat 56
Overcoat 56
Boy 55.8
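
The Amazon tags above follow the output format of Rekognition label detection (a label name plus a 0-100 confidence). A minimal sketch of how a comparable list could be generated with boto3; the region, file name, and confidence threshold are illustrative assumptions, not values from this record:

```python
import boto3

# Label detection with Amazon Rekognition. Region, file name, and the
# MinConfidence cutoff are assumptions for illustration only.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the tag list above bottoms out near 55
)

# Print "Label confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```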

Clarifai
created on 2023-10-28

people 100
child 99.8
group together 98
group 97.5
boy 97.4
adult 97.4
many 96
woman 95.7
wear 93.7
monochrome 93.1
man 93.1
two 91.7
recreation 91.2
war 91
several 90.1
home 90.1
education 89.6
five 87.9
son 87.4
school 86.9
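
The Clarifai concepts above could be produced with a predict call against Clarifai's general image-recognition model. The sketch below assumes the v2 REST endpoint, an API-key header, and the usual outputs/data/concepts response shape; all of these vary by API version and account setup, and the model ID, key, and image URL are placeholders:

```python
import requests

# Sketch of a Clarifai v2 predict request (endpoint path, model ID, auth
# scheme, and response shape are assumptions, not confirmed by this record).
API_KEY = "YOUR_CLARIFAI_KEY"                 # placeholder credential
MODEL_ID = "general-image-recognition"        # assumed public general model
IMAGE_URL = "https://example.org/photo.jpg"   # hypothetical image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concepts are expected under outputs[0].data.concepts (assumed shape), each
# with a name and a 0-1 value comparable to the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```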

Imagga
created on 2022-02-26

man 34.3
musical instrument 31.4
people 25.1
person 23.2
male 22.1
adult 20.6
men 19.7
percussion instrument 18.7
business 18.2
marimba 17.7
couple 17.4
businessman 16.8
lifestyle 15.9
black 15.1
happy 15
portrait 14.9
wind instrument 14.4
indoors 14
sitting 13.7
love 13.4
kin 13.2
room 13.1
women 12.6
brass 12.2
office 12
indoor 11.9
casual 11.9
smiling 11.6
interior 11.5
attractive 11.2
leisure 10.8
handsome 10.7
smile 10.7
fashion 10.5
together 10.5
pretty 10.5
sexy 10.4
home 10.4
chair 10.3
friends 10.3
youth 10.2
suit 9.9
family 9.8
cheerful 9.7
computer 9.6
stringed instrument 9.5
model 9.3
two 9.3
classroom 9.2
businesswoman 9.1
holding 9.1
fun 9
team 9
group 8.9
job 8.8
working 8.8
looking 8.8
married 8.6
happiness 8.6
world 8.3
child 8.3
laptop 8.3
silhouette 8.3
outdoors 8.2
work 8.2
musician 8
posing 8
life 7.8
shopping cart 7.8
clothing 7.6
communication 7.6
meeting 7.5
executive 7.5
city 7.5
music 7.4
professional 7.4
handcart 7.3
alone 7.3
singer 7.2
modern 7
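
The Imagga tags above correspond to Imagga's tagging endpoint, which scores English labels on a 0-100 scale. The sketch below assumes the public v2 REST API with HTTP Basic auth; the credentials, image URL, and response shape are assumptions:

```python
import requests

# Sketch of an Imagga v2 tagging request (endpoint, auth style, and response
# shape are assumptions based on Imagga's public REST API).
API_KEY = "YOUR_IMAGGA_KEY"                   # placeholder credentials
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"   # hypothetical image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Tags are expected under result.tags (assumed shape), each with an English
# label and a confidence on the same 0-100 scale used in the list above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```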

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 98.7
window 93.3
text 92.7
black and white 80.9
clothing 58.3
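
The Microsoft tags above resemble the output of Azure Computer Vision image tagging. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders, not values from this record:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: substitute a real Computer Vision endpoint, key, and image.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as image_stream:  # hypothetical local copy
    result = client.tag_image_in_stream(image_stream)

# Each tag carries a 0-1 confidence; scaling by 100 matches the list above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```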

Face analysis

AWS Rekognition

Age 6-16
Gender Male, 51%
Happy 94.1%
Surprised 3.4%
Calm 1.2%
Angry 0.4%
Disgusted 0.3%
Fear 0.3%
Sad 0.2%
Confused 0%

AWS Rekognition

Age 10-18
Gender Male, 99.6%
Happy 96.5%
Surprised 1.9%
Calm 0.7%
Fear 0.6%
Sad 0.1%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
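
The two AWS Rekognition face reports above (age range, gender with confidence, and an emotion breakdown) match the shape of a detect_faces call with all attributes requested. A minimal sketch with boto3; the region and file name are illustrative assumptions:

```python
import boto3

# Face analysis with Amazon Rekognition. Attributes=["ALL"] requests the age
# range, gender, and emotion estimates reported above; region and file name
# are assumptions for illustration only.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions are returned unsorted; sort by confidence to match the report.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```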

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
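
The two Google Vision face reports above express each attribute as a likelihood bucket rather than a percentage. A minimal sketch using the google-cloud-vision client library; configured application credentials and the file name are assumptions:

```python
from google.cloud import vision

# Face detection with the google-cloud-vision client library; application
# credentials are assumed to be set up, and the file name is a placeholder.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

# Likelihood enum values index into this tuple (same wording as the results above).
likelihood = ("Unknown", "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```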

Feature analysis

Amazon

Person 99.6%
Person 99%
Person 81.2%
Helmet 80.9%
Shoe 78.3%
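
The per-object percentages above (three Person detections, one Helmet, one Shoe) are the kind of instance-level scores that Rekognition label detection reports alongside the image-level tags. A sketch of how they could be listed with boto3, under the same illustrative assumptions as the earlier label example:

```python
import boto3

# Instance-level detections from Rekognition detect_labels: labels such as
# Person, Helmet, and Shoe include an Instances list, each entry carrying its
# own confidence and bounding box (region and file name remain assumptions).
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=55)

for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"(box {box['Width']:.2f} x {box['Height']:.2f})")
```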