Human Generated Data

Title

Untitled (two girls with pumpkin)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17011

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 100
Person 99.2
Human 99.2
Person 84.3
Person 79.2
Chair 73.1
Couch 71.1
Silhouette 59.6
Cradle 58.8
Sunglasses 57
Accessories 57
Accessory 57
Meal 56.9
Food 56.9
Collage 56
Advertisement 56
Poster 56

Clarifai
created on 2023-10-29

people 99.6
monochrome 99.5
street 95.9
woman 95.2
adult 94.1
girl 94.1
portrait 93.4
man 92.6
black and white 92.6
art 91.1
vintage 90.9
child 89.8
wear 88.2
furniture 88
wedding 87.9
square 84.9
dancer 84.6
shadow 82
group 81.6
music 80.6

Imagga
created on 2022-02-26

negative 34.1
film 28.3
photographic paper 21.9
black 20.7
musical instrument 18.8
keyboard instrument 18.6
man 17.5
piano 16.6
photographic equipment 14.9
adult 14.9
person 14
people 13.9
silhouette 13.2
cradle 11.9
dress 11.7
male 11.3
stringed instrument 11.3
furniture 11.2
dark 10.9
grand piano 10.9
light 10.7
car 10.5
percussion instrument 10.3
hair 10.3
accordion 10.2
baby bed 10
fashion 9
night 8.9
sitting 8.6
business 8.5
portrait 8.4
studio 8.4
fun 8.2
one 8.2
device 8.2
model 7.8
seat 7.7
suit 7.5
human 7.5
wind instrument 7.3
sexy 7.2
art 7.2
religion 7.2
face 7.1

Microsoft
created on 2022-02-26

Face analysis

AWS Rekognition

Age 16-22
Gender Female, 80.6%
Calm 95.4%
Fear 1.6%
Sad 1.2%
Happy 0.8%
Surprised 0.4%
Disgusted 0.3%
Angry 0.2%
Confused 0.1%

AWS Rekognition

Age 16-24
Gender Female, 51.9%
Sad 71.8%
Happy 9.5%
Fear 5.8%
Angry 4.6%
Calm 4.3%
Disgusted 1.8%
Surprised 1.5%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Feature analysis

Amazon

Person 99.2%
Person 84.3%
Person 79.2%
Chair 73.1%
Couch 71.1%

Captions

Microsoft
created on 2022-02-26

an old photo of a person 51.4%
an old photo of a person 51.3%
old photo of a person 47.5%

Text analysis

Amazon

8
ХАЛОЖ
YT39A2 ХАЛОЖ
YT39A2