Human Generated Data

Title

Occupying Wall Street, November 3, 2011

Date

2011

People

Artist: Accra Shepp, American, born 1962

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, 2019.314.4

Copyright

© Accra Shepp

Machine Generated Data

Tags

Amazon
created on 2023-07-07

Body Part 100
Finger 100
Hand 100
Cap 99.9
Clothing 99.9
Person 99.6
Adult 99.6
Male 99.6
Man 99.6
Photography 99.6
Face 99.4
Head 99.4
Portrait 99.4
Footwear 98.8
Shoe 98.8
Musical Instrument 96.3
Guitar 94
Guitarist 93.4
Leisure Activities 93.4
Music 93.4
Musician 93.4
Performer 93.4
Hat 88.1
Machine 84
Wheel 84
Beanie 57.7
Electrical Device 56.7
Microphone 56.7
Sitting 55.3

Clarifai
created on 2023-10-13

people 99.8
one 99.5
portrait 99.4
man 98.7
music 98.3
guitar 98.3
adult 98.2
monochrome 98.2
stringed instrument 95.9
musician 94.8
street 93.2
two 92.3
sit 91.3
instrument 91
guitarist 89.6
indoors 89.1
wear 86.9
recreation 86.1
sitting 85.1
singer 82.8

Imagga
created on 2023-07-07

person 35.8
sitting 32.6
laptop 31.2
man 30.9
people 30.1
car 29.6
adult 27.2
male 25.6
happy 21.9
computer 21.7
smile 20.7
work 20.4
working 20.3
vehicle 19.9
attractive 19.6
automobile 19.2
interior 18.6
driver 18.5
pretty 18.2
smiling 18.1
portrait 16.8
job 16.8
seat 15.7
crutch 15.5
lifestyle 15.2
home 15.2
notebook 15.1
women 15
business 14.6
worker 14.5
model 14
staff 13.9
auto 13.4
men 12.9
fashion 12.8
transportation 12.6
lady 12.2
sexy 12.1
professional 12
one 11.9
indoors 11.4
technology 11.1
black 11
transport 11
office 10.9
chair 10.8
handsome 10.7
sofa 10.6
sit 10.4
luxury 10.3
room 10.2
casual 10.2
indoor 10
cheerful 9.8
device 9.7
driving 9.7
success 9.7
stick 9.6
hair 9.5
guy 9.5
drive 9.5
suit 9.4
phone 9.2
student 9.2
inside 9.2
style 8.9
wheel 8.6
elegant 8.6
face 8.5
youth 8.5
support 8.4
clothing 8.3
human 8.3
alone 8.2
outdoors 8.2
disk jockey 8.1
looking 8
businessman 7.9
couple 7.8
world 7.8
couch 7.7
modern 7.7
collar 7.7
two 7.6
reading 7.6
desk 7.6
one person 7.5
smart 7.5
leisure 7.5
portable computer 7.4
holding 7.4
relaxing 7.3
businesswoman 7.3
happiness 7.1

Microsoft
created on 2023-07-07

person 98.8
sitting 98.7
clothing 96.9
man 88.2
black and white 82
music 81.9
human face 79.8

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 87.8%
Calm 91.7%
Surprised 7.6%
Fear 6.1%
Sad 2.9%
Happy 2.4%
Angry 0.4%
Confused 0.3%
Disgusted 0.2%

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Adult 99.6%
Male 99.6%
Man 99.6%
Shoe 98.8%
Guitar 94%
Hat 88.1%
Wheel 84%

Captions

Microsoft
created on 2023-07-07

a man sitting on a bench 78.9%
a man sitting on a table 78.8%
a man sitting in a chair 78.7%

Text analysis

Amazon

hot

Google

Toroundl ASUKA 在
Toroundl
ASUKA