Human Generated Data

Title

Untitled (woman sitting on beach by fence)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19473

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.7
Apparel 99.7
Person 99.3
Human 99.3
Hat 77.9
Overcoat 69.5
Coat 69.5
Silhouette 67.7
Sun Hat 66.1

Clarifai
created on 2023-10-22

people 99.5
art 99.5
portrait 98.2
wear 97.8
monochrome 97.8
man 96.8
lid 95.8
one 95.4
furniture 92.5
adult 91.8
woman 90.5
black and white 89.5
vintage 89.5
seat 89.2
veil 88.9
analogue 87.8
rain 87.2
street 86.7
sitting 86.5
sit 85.4

Imagga
created on 2022-03-05

silhouette 29.8
person 23.7
dark 20.9
sunset 19.8
man 19.5
adult 19.1
people 19
body 17.6
sensuality 16.4
model 15.6
sexy 15.3
fashion 14.3
male 13.6
one 13.4
posing 13.3
light 12.7
attractive 12.6
erotic 12.5
black 12.4
portrait 12.3
passion 12.2
night 11.5
water 11.3
style 11.1
world 11.1
hair 11.1
sport 10.8
sun 10.7
lady 10.6
covering 10.4
beach 10.2
sky 10.2
cell 10
sensual 10
exercise 10
shadow 9.9
human 9.8
seductive 9.6
love 9.5
sunrise 9.4
ocean 9.3
studio 9.1
pretty 9.1
sea 8.6
pleasure 8.5
skin 8.5
landscape 8.2
dress 8.1
wet 8
cool 8
boy 7.8
solitude 7.7
outdoor 7.6
balance 7.6
hot 7.5
enjoy 7.5
happy 7.5
evening 7.5
jacket 7.4
makeup 7.3
alone 7.3
fitness 7.2
face 7.1

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.6
outdoor 96.9
person 92.1
black and white 89.1
clothing 87.1
monochrome 72.7
man 69.3
human face 55.5

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 63.8%
Happy 42.6%
Calm 23.6%
Sad 14.7%
Fear 13.3%
Confused 2.4%
Surprised 1.5%
Disgusted 1.4%
Angry 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%

Captions

Text analysis

Amazon

YT33A2
MAMT2A3
FRODVK