Human Generated Data

Title

Untitled (three sisters seated together at home)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16976

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.1
Human 98.1
Person 95.2
Person 92.7
Clothing 91.6
Apparel 91.6
Furniture 80.5
Female 71.8
Dress 68.9
Tub 65.1
Toy 64.1
Girl 61.7
Chair 57.9
Bathtub 56.7
Floor 56.7
Figurine 56
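
The labels above are listed as Amazon Rekognition output with confidence percentages. As a minimal sketch (not necessarily the pipeline used for this record), comparable labels could be requested through boto3's detect_labels; the region, file name, and confidence threshold below are assumptions.

```python
import boto3

# Assumed region and a hypothetical local copy of the photograph.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the record lists labels down to ~56
    )

# Print each label with its confidence, matching the "Label 98.1" style above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```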

Clarifai
created on 2023-10-29

people 99.3
child 98.6
monochrome 98.5
family 96.4
woman 94
girl 93.4
group 93.3
indoors 91.1
man 89.9
adult 89.7
wedding 86.4
portrait 85.6
three 84.9
baby 84
sit 81.1
two 81.1
son 79.9
interaction 78.5
love 78.1
smile 76

Imagga
created on 2022-02-26

automaton 34.5
person 25.4
mask 22.9
people 21.7
adult 18.8
man 18.1
equipment 16.8
male 16.3
health 15.3
black 15
body 14.4
protective covering 14
men 13.7
women 13.4
lifestyle 13
face 12.8
toilet tissue 12.5
film 11.6
holding 11.5
covering 11.3
human 11.2
work 11
exercise 10.9
smiling 10.8
weight 10.7
gym 10.5
negative 10.5
portrait 10.3
device 10
tissue 9.9
machine 9.7
medical 9.7
looking 9.6
home 9.6
life 9.4
training 9.2
leisure 9.1
hospital 9
team 9
technology 8.9
child 8.8
professional 8.5
indoor 8.2
photographic paper 8.2
fitness 8.1
science 8
working 7.9
medicine 7.9
indoors 7.9
happiness 7.8
dumbbell 7.8
exercising 7.7
room 7.6
fashion 7.5
ball 7.5
care 7.4
sports equipment 7.4
helmet 7.3
sexy 7.2
plaything 7.2
active 7.2
clothing 7.2
hair 7.1
worker 7.1
job 7.1
interior 7.1
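
Imagga exposes its tagging model through a REST endpoint. As a hedged sketch, tags in the style above could be fetched with a request like the following; the credentials and image URL are placeholders, and the /v2/tags path reflects Imagga's public v2 API rather than anything recorded here.

```python
import requests

API_KEY = "your_api_key"        # placeholder credentials
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder image location

# Query Imagga's v2 tagging endpoint with HTTP basic auth.
response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each result carries an English tag name and a confidence score.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```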

Microsoft
created on 2022-02-26

text 97.5
black and white 88.1
indoor 86.7

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Female, 99.5%
Surprised 88.9%
Fear 8.7%
Calm 1.7%
Happy 0.3%
Disgusted 0.1%
Angry 0.1%
Confused 0.1%
Sad 0.1%

AWS Rekognition

Age 24-34
Gender Female, 99.2%
Surprised 52.4%
Fear 21%
Happy 17.9%
Calm 4.6%
Disgusted 1.5%
Confused 1.2%
Angry 0.8%
Sad 0.6%

AWS Rekognition

Age 30-40
Gender Female, 74.3%
Surprised 97.1%
Calm 1.8%
Angry 0.5%
Fear 0.3%
Disgusted 0.1%
Happy 0.1%
Confused 0.1%
Sad 0.1%
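
The face estimates above (age range, gender, and ranked emotion confidences) match the shape of Amazon Rekognition's detect_faces response. A minimal sketch, assuming a local copy of the image and the full attribute set:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```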

Feature analysis

Amazon

Person
Person 98.1%
Person 95.2%
Person 92.7%

Captions

Microsoft
created on 2022-02-26

a person sitting on a table 33.3%
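
The caption above is attributed to Microsoft. As a hedged sketch, a comparable caption could be produced with the Azure Computer Vision SDK's describe_image_in_stream; the endpoint, key, and file path are placeholders, and this is not necessarily the exact service version behind this record.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_key"),
)

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    analysis = client.describe_image_in_stream(f)

# Each caption carries a 0-1 confidence; scale to match the percentage above.
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```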

Text analysis

Amazon

RODAR-SELA

Google

KODVR- eVEELA
KODVR-
eVEELA
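
The strings above are OCR output, possibly a misread photographic-paper edge marking on the print. A minimal sketch of text detection with the Google Cloud Vision client library, assuming a local copy of the image:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # hypothetical local copy of the image
    image = vision.Image(content=f.read())

# The first annotation is the full detected string; the rest are individual
# tokens, which matches the full-string-plus-fragments layout above.
response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```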