Human Generated Data

Title

On the Bowery

Date

1971

People

Artist: Edward Grazda, American, born 1947

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of the artist, 2.2002.815

Copyright

© Edward Grazda

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Apparel 99.9
Clothing 99.9
Person 97.9
Human 97.9
Coat 88
Jacket 87
Shoe 85.3
Footwear 85.3
Finger 70.4
Pants 69.3
Overcoat 63.7
Face 61.1
Person 54.2

Clarifai
created on 2023-10-15

people 99.5
portrait 98.9
man 98
girl 97.1
adult 96.5
street 96.1
foot 96
woman 96
monochrome 95.8
wear 95.3
child 95.1
two 94.8
couple 94.6
group together 94.1
boy 93.7
coat 93.6
model 91.5
one 91
group 90.3
son 89.2

Imagga
created on 2021-12-14

sleeping bag 61.1
bag 47.1
leg 26.2
people 25.1
footwear 23.8
person 22.3
adult 20.1
shoe 19.9
fashion 18.8
black 17.8
man 17.5
jeans 17.2
male 17
lying 16.9
model 16.3
body 15.2
hair 15.1
clothing 14.5
smile 14.3
women 14.2
legs 14.2
men 13.7
youth 13.6
shoes 13.4
sport 13.4
loafer 12.8
casual 12.7
exercise 12.7
smiling 12.3
happy 11.9
relax 11.8
boots 11.7
leisure 11.6
cute 11.5
studio 11.4
human 11.2
floor 11.2
pose 10.9
style 10.4
rest 10.2
skate 10.2
hand 9.9
attractive 9.8
covering 9.6
sitting 9.4
outside 9.4
two 9.3
action 9.3
face 9.2
cadaver 9.2
pretty 9.1
fitness 9
fun 9
outdoors 9
one 9
posing 8.9
sexy 8.8
lie 8.8
boot 8.5
lady 8.1
working 8
lifestyle 7.9
love 7.9
couple 7.8
hands 7.8
portrait 7.8
eyes 7.7
outdoor 7.6
resting 7.6
skin 7.6
child 7.6
relaxation 7.5
street 7.4
alone 7.3
girls 7.3
relaxing 7.3
blond 7.2
skateboard 7.2
grass 7.1
splint 7.1
board 7.1

Face Analysis

AWS Rekognition

Age 32-48
Gender Male, 96%
Calm 99.2%
Sad 0.3%
Surprised 0.2%
Fear 0.2%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 28-44
Gender Female, 98.5%
Fear 49.1%
Sad 48.3%
Surprised 0.8%
Calm 0.8%
Happy 0.5%
Angry 0.3%
Confused 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature Analysis

Amazon

Person 97.9%
Shoe 85.3%
