Human Generated Data

Title

Untitled (family portrait in garden)

Date

1920s

People

Artist: Bachrach Studios, founded 1868

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.601

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Person 99.3
Person 99.1
Furniture 97
Person 96.5
Clothing 84.6
Apparel 84.6
People 77.9
Art 69
Bench 61.4
Sitting 60
Chair 56.9
Senior Citizen 55.3

Clarifai
created on 2023-10-26

people 99.8
child 99.2
sepia 98.8
family 98.4
man 98.2
portrait 97.9
son 97.4
woman 97
two 96.2
documentary 95.9
adult 95.5
wear 94.6
nostalgia 94.6
offspring 93.9
furniture 93.4
vintage 93
retro 92.9
affection 92.1
girl 90.7
facial expression 89.8

Imagga
created on 2022-01-23

parent 32.9
child 30.9
mother 29.6
dad 23.7
people 22.9
man 22.8
father 22.4
portrait 18.8
world 18.5
male 16.1
love 15.8
old 15.3
happiness 14.9
military uniform 13.9
family 13.3
park 13.2
clothing 13.1
baby 12.9
face 12.8
sepia 12.6
adult 12.4
couple 12.2
black 12
human 12
hair 11.9
neonate 11.8
bride 11.5
happy 11.3
youth 11.1
person 10.7
uniform 10.7
fashion 10.5
kin 10.2
dress 9.9
pretty 9.8
eyes 9.5
model 9.3
cute 9.3
head 9.2
wedding 9.2
girls 9.1
one 9
kid 8.9
sexy 8.8
women 8.7
boy 8.7
antique 8.7
men 8.6
statue 8.6
smile 8.6
sibling 8.5
two 8.5
attractive 8.4
vintage 8.3
lady 8.1
smiling 8
life 7.9
ancient 7.8
marble 7.7
married 7.7
relaxation 7.5
future 7.4
classic 7.4
historic 7.3
cheerful 7.3
room 7.3
playing 7.3
looking 7.2
art 7.2
romance 7.1

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

gallery 98.7
clothing 97.1
room 96.9
person 96.7
scene 94.8
text 79.5
posing 77
woman 65.6
man 60.9
old 48.9
picture frame 12.5

Face analysis

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 99.6%
Confused 0.2%
Sad 0.1%
Angry 0%
Happy 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 54-64
Gender Male, 99.9%
Calm 62.2%
Confused 14%
Happy 13.5%
Sad 6.9%
Fear 1.5%
Angry 0.7%
Disgusted 0.6%
Surprised 0.6%

AWS Rekognition

Age 18-26
Gender Male, 100%
Calm 88.8%
Angry 9.6%
Confused 0.5%
Sad 0.5%
Surprised 0.2%
Fear 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 54-62
Gender Female, 99.6%
Calm 91.4%
Sad 4.9%
Confused 2.8%
Fear 0.4%
Angry 0.3%
Surprised 0.1%
Happy 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Categories

Imagga

paintings art 100%