Human Generated Data

Title

Making Pittsburg Stogies

Date

1909, printed later?

People

Artist: Lewis Wickes Hine, American, 1874-1940

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Naomi and Walter Rosenblum, P1979.79

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 97.4
Clothing 72
Apparel 72
Machine 66.6
Furniture 60.2
Gun 59.2
Weapon 59.2
Weaponry 59.2
Screen 58.4
Electronics 58.4
Pub 55.4
Bar Counter 55.2
Building 55.2
Finger 55.1

Clarifai
created on 2023-10-25

people 99.8
adult 98.4
monochrome 98
man 97.4
woman 96.2
bar 95.9
concentration 95
portrait 94.9
sit 93.4
street 93.4
two 93.1
room 91.6
boy 90
one 89.8
music 88.1
furniture 88.1
actor 87.1
musician 85.5
nightclub 84.9
recreation 83.5

Imagga
created on 2022-01-08

laptop 49.4
scholar 46.9
person 46
computer 43.8
intellectual 38.7
people 30.1
working 29.2
adult 27.2
business 26.1
office 26.1
work 25.9
man 25
notebook 22.6
technology 22.3
male 22.1
happy 20.7
job 19.5
keyboard 18.4
home 18.4
sitting 18.1
portable computer 17.7
communication 17.6
looking 17.6
smiling 17.4
businesswoman 17.3
desk 17
musical instrument 16.2
worker 16.1
indoors 15.8
attractive 15.4
smile 15
executive 14.8
portrait 14.2
face 14.2
child 14.2
pretty 14
television 13.9
student 13.8
lifestyle 13.7
suit 13.5
personal computer 13.5
one 13.4
professional 12.8
casual 12.7
hair 12.7
black 12.3
together 12.3
lady 12.2
mature 12.1
corporate 12
women 11.9
typing 11.7
education 11.3
senior 11.3
telecommunication system 11
indoor 11
workplace 10.5
boy 10.4
piano 10.4
learning 10.3
stringed instrument 10.3
study 10.3
model 10.1
room 10.1
alone 10.1
relaxing 10
businessman 9.7
success 9.7
youth 9.4
hand 9.1
digital computer 9
human 9
interior 8.9
seated 8.8
couple 8.7
love 8.7
happiness 8.6
sofa 8.6
cute 8.6
men 8.6
wireless 8.6
keyboard instrument 8.5
leisure 8.3
clothing 8.2
school 8.1
family 8
kid 8
look 7.9
day 7.9
play 7.8
couch 7.7
read 7.7
only 7.6
resting 7.6
reading 7.6
businesspeople 7.6
mother 7.5
relaxed 7.5
fun 7.5
floor 7.4
phone 7.4
successful 7.3
blond 7.3
modern 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98.5
person 96.3
black and white 95.1
monochrome 70.2
clothing 68.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 25-35
Gender Female, 86.9%
Calm 65.4%
Happy 32.2%
Sad 1.1%
Angry 0.4%
Fear 0.3%
Disgusted 0.3%
Surprised 0.2%
Confused 0.1%

AWS Rekognition

Age 23-31
Gender Female, 99.9%
Calm 91.6%
Sad 7.9%
Disgusted 0.1%
Confused 0.1%
Happy 0.1%
Surprised 0.1%
Angry 0.1%
Fear 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Gun 59.2%

Categories

Captions