Human Generated Data

Title

Untitled (children on couch with telephone)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17179

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 97.8
Human 97.8
Clothing 95.4
Apparel 95.4
Person 94.7
Dress 94.5
Camera 91.5
Electronics 91.5
Face 85.3
Photographer 73.1
Kid 69.5
Child 69.5
Girl 68.6
Female 68.6
Photography 65.4
Photo 65.4
Portrait 63.8
Shorts 59.5

Clarifai
created on 2023-10-28

people 99.9
child 99.4
two 98.8
group 98.6
son 96.2
monochrome 96
wear 94.8
man 94.6
three 93.2
facial expression 92.5
outfit 92.3
portrait 92
adult 91.5
one 90.4
recreation 90.1
family 90
music 89.9
several 89.6
nostalgia 87.6
boy 87.6

Imagga
created on 2022-02-26

sibling 22.4
people 20.6
senior 20.6
man 20.3
person 20
brass 19.8
happy 19.4
portrait 19.4
child 18.5
old 17.4
male 16.5
adult 16.2
family 15.1
wind instrument 14.6
elderly 14.3
home 14.3
hair 14.2
smiling 13.7
human 13.5
mother 13.4
mature 13
black 12.6
retired 12.6
happiness 12.5
smile 12.1
love 11.8
musical instrument 11.4
face 11.4
fun 11.2
world 10.9
kin 10.6
daughter 10.5
couple 10.4
looking 10.4
grandma 10.4
lifestyle 10.1
head 9.2
playing 9.1
health 9
childhood 8.9
grandmother 8.8
older 8.7
parent 8.7
women 8.7
retirement 8.6
day 8.6
eyes 8.6
casual 8.5
joy 8.3
negative 8.3
holding 8.2
gray 8.1
active 8.1
little 7.9
look 7.9
film 7.7
hand 7.6
laughing 7.6
leisure 7.5
outdoors 7.5
glasses 7.4
life 7.3
body 7.2
cute 7.2
science 7.1
medical 7.1
blond 7
modern 7
together 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

human face 92.6
text 89.9
person 85.2
black and white 75.1
clothing 71.7
drawing 53.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 7-17
Gender Female, 90.8%
Calm 65%
Surprised 22.8%
Happy 4.7%
Disgusted 2%
Confused 1.6%
Sad 1.5%
Angry 1.3%
Fear 1.1%

AWS Rekognition

Age 24-34
Gender Female, 83.7%
Surprised 93.7%
Happy 4.6%
Fear 0.8%
Calm 0.5%
Angry 0.2%
Disgusted 0.1%
Sad 0.1%
Confused 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 97.8%
Person 94.7%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

MJI7
MJI7 YT3RAS
YT3RAS

Google

MJ7 YT3RA2 0A
MJ7
YT3RA2
0A