Human Generated Data

Title

Untitled (two girls and baby combing hair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17431

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.1
Human 98.1
Person 95.6
Interior Design 95.4
Indoors 95.4
Clothing 92.9
Apparel 92.9
Person 91.5
Person 88
Face 87.5
Person 81.5
Floor 78.5
Door 78.2
Baby 77.7
Furniture 76.5
Room 72
Kid 71.7
Child 71.7
Female 70.8
Person 70.4
Portrait 67.4
Photography 67.4
Photo 67.4
Girl 64.6
People 64
Housing 60.6
Building 60.6
Play 59.2
Bed 58.8

Clarifai
created on 2023-10-29

people 99.7
monochrome 99.1
child 97.5
group 96.3
adult 95.4
group together 95.1
man 94.1
woman 92.3
wear 86.9
nostalgia 85.3
boy 85.1
portrait 84.3
family 83.7
humor 82.7
indoors 81.8
recreation 81.1
sitting 80.5
two 79.1
music 79.1
three 78.7

Imagga
created on 2022-02-26

man 28.9
people 24.5
person 22.5
male 19.9
equipment 19.8
adult 19.5
device 14.8
clothing 14.3
worker 14.3
work 14.1
professional 13.7
portrait 12.9
human 12.7
black 12.6
health 12.5
patient 11.7
sexy 11.2
hair 11.1
women 11.1
lifestyle 10.8
sports equipment 10.8
face 10.6
interior 10.6
working 10.6
body 10.4
men 10.3
happy 10
sport 10
holding 9.9
modern 9.8
fashion 9.8
job 9.7
looking 9.6
home 9.6
doctor 9.4
casual 9.3
boxing glove 9.2
mask 9.1
business 9.1
attractive 9.1
dress 9
medical 8.8
urban 8.7
life 8.7
covering 8.7
hospital 8.5
youth 8.5
room 8.5
blond 8.4
hand 8.3
city 8.3
occupation 8.2
makeup 8.2
clinic 8.2
healthy 8.2
exercise 8.2
technology 8.2
nurse 8.1
brassiere 8
medicine 7.9
look 7.9
instrument 7.9
sitting 7.7
surgeon 7.7
two 7.6
child 7.6
head 7.6
legs 7.5
house 7.5
leisure 7.5
one 7.5
machine 7.5
boxing equipment 7.4
training 7.4
weight 7.4
indoor 7.3
consumer goods 7.1
indoors 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

black and white 89.4
person 85.6
text 82.9
clothing 82.8

Face analysis

Amazon

AWS Rekognition

Age 43-51
Gender Male, 71.1%
Happy 92.3%
Calm 5.3%
Sad 1.1%
Surprised 0.4%
Fear 0.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%

AWS Rekognition

Age 12-20
Gender Male, 95.2%
Calm 89.3%
Sad 8.3%
Fear 0.9%
Surprised 0.4%
Disgusted 0.3%
Happy 0.3%
Confused 0.2%
Angry 0.2%

AWS Rekognition

Age 18-26
Gender Female, 54.4%
Calm 99.2%
Surprised 0.6%
Disgusted 0%
Confused 0%
Happy 0%
Sad 0%
Angry 0%
Fear 0%

Feature analysis

Amazon

Person
Person 98.1%
Person 95.6%
Person 91.5%
Person 88%
Person 81.5%
Person 70.4%

Captions

Microsoft
created on 2022-02-26

a man doing a trick on a skateboard 30.2%

Text analysis

Amazon

47
KODOK-EVEELA
HMEV