Human Generated Data

Title

Untitled (woman tossing coin into fountain)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21565

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Clothing 93.1
Apparel 93.1
Table Lamp 88.2
Lamp 88.2
Shoe 82.4
Footwear 82.4
Interior Design 69.1
Indoors 69.1
Female 61.6
Furniture 59.8
Tub 57.2
Table 55.7
Housing 55
Building 55

Clarifai
created on 2023-10-22

people 99.9
one 99.1
adult 98.6
woman 98.3
wear 95.9
monochrome 95.6
two 94.1
window 93.3
portrait 93.1
street 92.5
home 90.7
indoors 90.1
administration 89.6
actress 87.6
leader 87.2
room 87.1
commerce 86.8
music 86.1
fashion 83.8
dress 82.4

Imagga
created on 2022-03-05

person 30.6
adult 30.1
fashion 24.1
attractive 23.1
people 22.9
lady 21.1
sexy 20.9
pretty 20.3
smiling 20.3
clothing 18
lifestyle 17.4
black 17
portrait 16.8
cute 16.5
smile 16.4
sensuality 16.4
model 16.3
posing 16
blond 15.9
hair 15.9
dress 15.4
elegance 15.1
exercise 14.5
happy 14.4
body 14.4
legs 14.2
indoors 14.1
cleaner 13.9
sitting 13.7
fitness 13.6
women 13.4
one 13.4
skirt 13
sport 12.9
human 12.7
elegant 12
fit 12
street 12
modern 11.9
style 11.9
sensual 11.8
man 11.4
cheerful 11.4
brunette 11.3
casual 11
happiness 11
city 10.8
active 10.8
interior 10.6
gym 10.5
cover girl 10.5
wall 10.3
expression 10.2
shopping 10.1
miniskirt 10
urban 9.6
standing 9.6
face 9.2
dancer 9.2
leisure 9.1
bathroom 9.1
health 9
life 8.9
lovely 8.9
looking 8.8
outfit 8.6
outdoor 8.4
training 8.3
slim 8.3
holding 8.3
garment 8.2
smasher 8.1
male 8.1
sexual 7.7
outside 7.7
joy 7.5
lips 7.4
business 7.3
pose 7.2
color 7.2

Microsoft
created on 2022-03-05

person 96.5
black and white 95.9
clothing 94.8
text 82.6
monochrome 77.2
street 76.9
footwear 71.5
sink 51.3

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 99.8%
Happy 54.4%
Calm 30.1%
Sad 10.3%
Disgusted 1.2%
Surprised 1.1%
Fear 1.1%
Angry 0.9%
Confused 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Shoe 82.4%
