Human Generated Data

Title

Untitled (man and woman sitting on beach)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19471

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 99.3
Clothing 94.9
Apparel 94.9
Face 91.8
Blonde 88.7
Teen 88.7
Kid 88.7
Child 88.7
Female 88.7
Girl 88.7
Woman 88.7
Nature 86.5
Outdoors 85.5
Chair 78.6
Furniture 78.6
Portrait 71.1
Photography 71.1
Photo 71.1
People 69.1
Ground 61.5
Countryside 60.6
Housing 60.1
Building 60.1
Grass 57.9
Plant 57.9
Mammal 56
Animal 56

Clarifai
created on 2023-10-22

people 99.9
wedding 98.6
portrait 98.4
adult 97.1
monochrome 97
woman 96.8
two 96.8
bride 96.5
wear 95.5
man 95
girl 92.7
group 92.4
dress 90.2
family 90
group together 89.9
three 89.7
groom 88.4
child 87.2
street 86
actress 84.7

Imagga
created on 2022-03-05

world 26.7
person 22
people 20.6
man 18.8
dark 17.5
portrait 17.5
adult 14.4
male 14.3
silhouette 13.2
model 12.4
love 11.8
black 11.4
happy 11.3
sunset 10.8
groom 10.7
water 10.7
night 10.7
fashion 10.6
attractive 10.5
one 10.5
forest 10.4
hair 10.3
lifestyle 10.1
old 9.8
sexy 9.6
body 9.6
couple 9.6
scene 9.5
rain 9.4
happiness 9.4
face 9.2
sensuality 9.1
child 9.1
dress 9
sky 8.9
rustic 8.8
enjoy 8.5
kin 8.5
outdoor 8.4
park 8.3
human 8.2
sensual 8.2
lady 8.1
wet 8.1
light 8
smile 7.8
darkness 7.8
mood 7.8
men 7.7
fear 7.7
elegant 7.7
pretty 7.7
two 7.6
elegance 7.6
passion 7.5
protection 7.3
religion 7.2
art 7.2
romance 7.1
romantic 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 99.8
outdoor 96.7
black and white 89.8
human face 80.3
clothing 77.2
person 73.4
black 69.4
white 68.9
old 56.3

Color Analysis

Face analysis

AWS Rekognition

Age 40-48
Gender Female, 93.9%
Calm 55.5%
Happy 23.2%
Surprised 11%
Fear 4.1%
Angry 2.4%
Sad 1.7%
Disgusted 1.3%
Confused 0.9%

AWS Rekognition

Age 31-41
Gender Female, 69.1%
Calm 59%
Happy 39.9%
Disgusted 0.3%
Sad 0.3%
Confused 0.2%
Surprised 0.2%
Fear 0.1%
Angry 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.5%
Person 99.3%

Text analysis

Amazon

Coca-Cola
sai

Google

Ca-Cola NAGON-YT37A2-NAMTZA3
Ca-Cola
NAGON-YT37A2-NAMTZA3