Human Generated Data

Title

Untitled (woman and two babies on bed)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16826

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 99.2
Human 99.2
Person 98.4
Person 98.1
Play 93.7
Baby 93
Spoon 85.6
Cutlery 85.6
Face 70.6
Indoors 66.2
Kid 64.6
Child 64.6
Person 63.3
Floor 60.5
Room 57.8

Clarifai
created on 2023-10-29

people 99.8
monochrome 99.2
child 99
two 97.4
adult 95.4
man 94.6
portrait 93.6
woman 93.1
street 90.4
group 90.4
son 88.8
baby 88.2
family 87.6
wear 87.3
facial expression 82.7
love 82.5
offspring 82.1
music 80.9
girl 80.4
elderly 80.3

Imagga
created on 2022-02-26

person 30.5
man 27.5
adult 24
people 22.3
male 20.9
teacher 20.6
happy 15.7
world 14.3
portrait 14.2
child 13.4
smiling 13
black 12.6
room 12.5
educator 12.2
men 12
lifestyle 11.6
active 11.2
home 11.2
sport 10.9
hand 10.6
equipment 10.6
human 10.5
sitting 10.3
casual 10.2
exercise 10
patient 9.9
fitness 9.9
holding 9.9
businessman 9.7
grandfather 9.7
professional 9.5
happiness 9.4
senior 9.4
smile 9.3
board 9.2
business 9.1
school 9
kid 8.9
grandma 8.7
boy 8.7
work 8.6
blackboard 8.5
two 8.5
health 8.3
clothing 8.3
alone 8.2
case 8.2
dad 8.1
dress 8.1
cheerful 8.1
looking 8
athlete 7.9
standing 7.8
high 7.8
education 7.8
play 7.7
classroom 7.7
kin 7.7
modern 7.7
strength 7.5
fun 7.5
outdoors 7.5
one 7.5
planner 7.4
sick person 7.4
student 7.2
body 7.2
hair 7.1
worker 7.1
face 7.1
parent 7.1
interior 7.1
chair 7.1
day 7.1
indoors 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.2
toddler 96.6
clothing 95
person 94.7
baby 91.7
human face 74.8
child 70.1
black and white 70.1
boy 60.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 42-50
Gender Male, 86.8%
Calm 77.7%
Happy 9.7%
Surprised 8.1%
Sad 2%
Confused 1.1%
Disgusted 0.9%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 16-24
Gender Female, 99.9%
Surprised 53.6%
Happy 20.5%
Sad 10.6%
Fear 8.7%
Angry 3.1%
Disgusted 2%
Calm 0.9%
Confused 0.6%

AWS Rekognition

Age 6-12
Gender Male, 50.8%
Calm 80.2%
Surprised 9.4%
Fear 9.3%
Disgusted 0.3%
Happy 0.3%
Angry 0.3%
Sad 0.1%
Confused 0.1%

Feature analysis

Amazon

Person
Spoon
Person 99.2%
Person 98.4%
Person 98.1%
Person 63.3%
Spoon 85.6%

Captions

Microsoft
created on 2022-02-26

a person posing for the camera 84.1%
a man and a woman posing for a photo 37.8%

Text analysis

Amazon

18
ПРАЗ-ХАС

Google

78
78