Human-Generated Data

Title

Untitled (woman sitting on stone steps)

Date

c. 1950

People

Artist: Boston Herald

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19439

Machine-Generated Data

Tags (confidence scores, %)

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Bonnet 99
Hat 99
Person 99
Human 99
Tarmac 98.9
Asphalt 98.9
Road 95.8
Pedestrian 69.3
Zebra Crossing 68.2
Shoe 58.3
Footwear 58.3
People 55.4
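
The Amazon figures above are label-detection confidence scores. As a minimal sketch (not necessarily the pipeline used here), tags in this form could be generated with the AWS Rekognition DetectLabels API via boto3; "photo.jpg" is a hypothetical placeholder for the scanned image:

    # Minimal sketch: label detection with AWS Rekognition (boto3).
    # "photo.jpg" is a hypothetical placeholder for the scanned image.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # drop labels scored below 50%
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")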

Clarifai
created on 2023-10-22

people 99.8
adult 96.9
child 96.4
one 94
wear 92.7
portrait 87
woman 84.5
education 84.5
group together 84.1
school 83.8
street 81.3
man 81.3
boy 80.1
two 79.8
recreation 79.6
sports equipment 77.1
lid 76.8
retro 76.4
art 76.3
outfit 75.5

Imagga
created on 2022-03-05

person 30.7
model 25.7
fashion 25.6
adult 25.4
body 23.2
style 23
sexy 22.5
hair 21.4
posing 20.4
sensuality 20
attractive 19.6
people 19.5
portrait 19.4
one 19.4
studio 19
black 18.9
dark 18.4
human 18
sport 17.1
dancer 17.1
dress 16.3
lifestyle 15.9
performer 15.6
man 15.5
erotic 14.3
legs 14.2
pretty 14
pose 13.6
elegance 13.4
exercise 12.7
skin 12.7
fitness 12.6
lady 12.2
dance 11.7
clothing 11.2
sitting 11.2
motion 11.1
women 11.1
nude 10.7
face 10.7
performance 10.5
action 10.2
casual 10.2
male 9.9
athlete 9.8
wall 9.6
passion 9.4
water 9.3
slim 9.2
teenager 9.1
sensual 9.1
fun 9
cool 8.9
suit 8.8
brunette 8.7
active 8.5
health 8.3
city 8.3
vintage 8.3
gorgeous 8.2
happy 8.1
aerobics 7.8
jumping 7.7
naked 7.7
desire 7.7
moving 7.6
balance 7.6
healthy 7.6
enjoy 7.5
mask 7.5
leg 7.5
outdoors 7.5
device 7.5
street 7.4
makeup 7.3
alone 7.3
world 7.3
make 7.3
wet 7.2
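
The Imagga list follows the same tag/confidence pattern. A minimal sketch of retrieving such tags from Imagga's public v2 /tags REST endpoint; the API key, secret, and image URL are hypothetical placeholders:

    # Minimal sketch: tagging via Imagga's v2 /tags REST endpoint.
    # Key, secret, and image URL are hypothetical placeholders.
    import requests

    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    resp.raise_for_status()

    for tag in resp.json()["result"]["tags"]:
        print(f"{tag['tag']['en']} {tag['confidence']:.1f}")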

Microsoft
created on 2022-03-05

text 97.9
outdoor 92.7
clothing 90.4
person 87.8
black and white 85.2
young 84.9
footwear 84.9
street 83.6
woman 56.5
posing 35.5

Color Analysis

(color swatches omitted)

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 94%
Calm 64.9%
Confused 13.9%
Surprised 6.6%
Disgusted 5.4%
Sad 4.1%
Angry 2.3%
Happy 1.6%
Fear 1.2%
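
These fields match the shape of AWS Rekognition's DetectFaces output (age range, gender, and per-emotion confidence). A minimal sketch, assuming the image is available as a local file; "photo.jpg" is a hypothetical placeholder:

    # Minimal sketch: face attributes with AWS Rekognition DetectFaces (boto3).
    # "photo.jpg" is a hypothetical placeholder.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age, gender, and emotion estimates
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")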

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
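
A minimal sketch of how likelihood ratings in this form could be obtained with the Google Cloud Vision face-detection API; "photo.jpg" is a hypothetical placeholder and credentials are assumed to be configured in the environment:

    # Minimal sketch: face likelihoods with Google Cloud Vision.
    # "photo.jpg" is a hypothetical placeholder; credentials come from the
    # GOOGLE_APPLICATION_CREDENTIALS environment variable.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    for face in client.face_detection(image=image).face_annotations:
        # Likelihood fields are enums: VERY_UNLIKELY ... VERY_LIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)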

Feature analysis

Amazon

Person 99%
Shoe 58.3%

Captions

Microsoft
created on 2022-03-05

a young boy standing in front of a building 36.8%
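
A caption with a confidence score in this form could be produced with Azure's Computer Vision describe operation. A minimal sketch; the endpoint, key, and image path are hypothetical placeholders:

    # Minimal sketch: image captioning with Azure Computer Vision's
    # describe operation. Endpoint, key, and path are hypothetical placeholders.
    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://your-resource.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_key"),
    )

    with open("photo.jpg", "rb") as f:
        description = client.describe_image_in_stream(f, max_candidates=1)

    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")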

Text analysis

Amazon

or
XAOOX
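
Word-level detections like these could be produced with AWS Rekognition's DetectText API. A minimal sketch; "photo.jpg" is a hypothetical placeholder:

    # Minimal sketch: OCR-style text detection with AWS Rekognition
    # DetectText (boto3). "photo.jpg" is a hypothetical placeholder.
    import boto3

    client = boto3.client("rekognition")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    for det in response["TextDetections"]:
        if det["Type"] == "WORD":  # skip aggregated LINE entries
            print(det["DetectedText"], f"{det['Confidence']:.1f}%")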