Human Generated Data

Title

Untitled (mother and three children)

Date

1920s

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.820

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.4
Human 99.4
Person 99.3
Person 98.5
Clothing 97.4
Apparel 97.4
Person 95.9
People 94.1
Grass 94
Plant 94
Meal 94
Food 94
Family 86.7
Tree 83.5
Shorts 79.5
Outdoors 78.2
Leisure Activities 74.7
Picnic 74.7
Vacation 74.7
Female 66.6
Dress 58.7
Photo 58.1
Photography 58.1
Hand 56.8
Person 54.8

Clarifai
created on 2023-10-25

people 100
child 99.1
two 98.2
man 97.7
family 97.6
wedding 94.9
offspring 94.2
woman 93.7
group 93.3
leader 92.2
love 91.5
girl 90.7
portrait 90.6
adult 90.3
sepia 89.7
son 89.6
group together 88.2
three 87.9
sibling 87
four 86.5

Imagga
created on 2022-01-09

outdoors 23.8
people 22.9
man 22.9
adult 22.1
male 20
summer 17.4
person 17
outdoor 16.8
world 16.1
child 15.6
beach 15.3
park 14.8
lifestyle 14.5
portrait 14.2
day 14.1
mechanical device 13.5
sunset 13.5
women 13.4
parent 13.2
sport 12.8
active 12.6
swing 12.3
sky 12.2
couple 12.2
happy 11.9
mother 11.4
men 11.2
groom 11.1
health 11.1
love 11.1
happiness 11
tree 10.9
exercise 10.9
model 10.9
field 10.9
fitness 10.8
leisure 10.8
attractive 10.5
athlete 10.5
mechanism 10.1
kin 9.8
fashion 9.8
pretty 9.8
sun 9.7
sunny 9.5
outside 9.4
smiling 9.4
dress 9
sexy 8.8
sand 8.8
autumn 8.8
country 8.8
grass 8.7
dad 8.6
youth 8.5
two 8.5
free 8.5
hand 8.4
joy 8.4
sprinkler 8.3
freedom 8.2
cheerful 8.1
lady 8.1
meadow 8.1
farm 8
hair 7.9
boy 7.8
sitting 7.7
fun 7.5
fit 7.4
teenager 7.3
danger 7.3
black 7.2
body 7.2
cute 7.2
activity 7.2
romance 7.1
smile 7.1
face 7.1
work 7.1
rural 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 99.9
tree 99.8
grass 98.3
clothing 89.9
person 89.5
old 83.8
text 79.5
woman 53.4
posing 52.3
vintage 30.4

Face analysis

AWS Rekognition

Age 7-17
Gender Female, 73.8%
Calm 90.1%
Happy 7.3%
Confused 0.7%
Sad 0.6%
Angry 0.5%
Surprised 0.3%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 7-17
Gender Female, 100%
Happy 97.5%
Calm 1%
Surprised 0.3%
Sad 0.3%
Angry 0.2%
Confused 0.2%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 98.2%
Sad 1%
Angry 0.2%
Happy 0.2%
Confused 0.1%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

AWS Rekognition

Age 11-19
Gender Female, 59.4%
Calm 94.9%
Confused 2.5%
Angry 1.1%
Surprised 0.5%
Fear 0.3%
Happy 0.2%
Disgusted 0.2%
Sad 0.2%

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 13
Gender Male

Microsoft Cognitive Services

Age 20
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Categories

Imagga

paintings art 97.1%
interior objects 1.9%

Text analysis

Amazon

Bachrech