Human Generated Data

Title

Untitled (New York City Reformatory, New Hampton, New York)

Date

May 1934-June 1934

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.44

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2022-04-15

Person 99.8
Human 99.8
Machine 95.1
Person 87.5
Sewing 82.1
Person 81.8
Person 73
Worker 71.3
Spoke 66.5
Building 62.5
Sitting 57.7
Motor 55.9
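
Label/confidence pairs like the ones above are the kind of output the Amazon Rekognition DetectLabels operation returns. A minimal boto3 sketch, assuming AWS credentials are already configured; the file name is a placeholder, not part of this record:

    import boto3

    rekognition = boto3.client("rekognition")

    # Read the photograph as raw bytes ("photo.jpg" is a placeholder path).
    with open("photo.jpg", "rb") as image_file:
        response = rekognition.detect_labels(
            Image={"Bytes": image_file.read()},
            MaxLabels=20,
            MinConfidence=55,
        )

    # Each label pairs a name with a confidence score, e.g. "Person 99.8".
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')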

Clarifai
created on 2023-10-14

people 99.9
monochrome 99.3
one 98.5
adult 98.4
man 98
train 96.3
two 96.2
street 95.2
portrait 94.1
concentration 92.8
transportation system 92.4
wear 90.4
sit 90
locomotive 88.8
vehicle 88.6
woman 86.6
actor 84.7
boy 84
three 83.2
indoors 82.7
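
Clarifai returns comparable concept/confidence predictions from its v2 predict endpoint. A rough HTTP sketch, assuming an API key and the hosted general model; the model ID, key, and image URL shown are placeholders rather than values taken from this record:

    import requests

    CLARIFAI_API_KEY = "<your-api-key>"            # placeholder credential
    MODEL_ID = "general-image-recognition"         # assumed public general model
    IMAGE_URL = "https://example.org/photo.jpg"    # placeholder image location

    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    )
    response.raise_for_status()

    # Concepts carry a 0-1 value; the listing above renders it as a percentage.
    for concept in response.json()["outputs"][0]["data"]["concepts"]:
        print(f'{concept["name"]} {concept["value"] * 100:.1f}')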

Imagga
created on 2022-04-15

brass 64.1
wind instrument 53.8
man 39.6
musical instrument 35.7
male 34.3
cornet 33.9
people 29.6
person 25.1
computer 21.7
indoors 21.1
adult 20.2
device 20.2
home 19.1
horn 18.9
men 18
working 17.7
laptop 17.6
smiling 17.4
oboe 16.9
music 16.7
business 16.4
bassoon 16.2
work 15.7
happy 15.7
sitting 14.6
playing 13.7
musician 13.6
portrait 13.6
couple 13.1
play 12.9
black 12.6
child 12.4
equipment 12.4
interior 12.4
lifestyle 12.3
instrument 12.2
office 12.2
room 12.1
table 12.1
happiness 11.8
worker 11.5
smile 11.4
keyboard 11.4
together 10.5
senior 10.3
holding 9.9
instrumentality 9.9
family 9.8
cheerful 9.8
job 9.7
businessman 9.7
group 9.7
professional 9.7
love 9.5
meeting 9.4
occupation 9.2
musical 8.6
guy 8.5
communication 8.4
old 8.4
hand 8.4
woodwind 8
sax 8
looking 8
hands 7.8
face 7.8
education 7.8
two people 7.8
talking 7.6
piano 7.4
alone 7.3
color 7.2
handsome 7.1
women 7.1
restaurant 7.1
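
Imagga's tagging endpoint produces the same kind of tag/confidence list. A minimal sketch against its REST API, assuming an API key and secret; the credentials and image URL are placeholders:

    import requests

    IMAGGA_KEY = "<api-key>"                       # placeholder credentials
    IMAGGA_SECRET = "<api-secret>"
    IMAGE_URL = "https://example.org/photo.jpg"    # placeholder image location

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
    )
    response.raise_for_status()

    # Each entry pairs an English tag with a confidence score, e.g. "brass 64.1".
    for entry in response.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')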

Google
created on 2022-04-15

Microsoft
created on 2022-04-15

person 98.9
man 94.8
text 61.4
black and white 58.7

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Female, 52.8%
Sad 99.7%
Confused 0.1%
Calm 0.1%
Fear 0%
Happy 0%
Disgusted 0%
Angry 0%
Surprised 0%
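
The age range, gender, and emotion percentages above correspond to the Amazon Rekognition DetectFaces operation with all facial attributes requested. A minimal boto3 sketch, assuming configured credentials; the file name is a placeholder:

    import boto3

    rekognition = boto3.client("rekognition")

    with open("photo.jpg", "rb") as image_file:  # placeholder path
        response = rekognition.detect_faces(
            Image={"Bytes": image_file.read()},
            Attributes=["ALL"],  # request age range, gender, and emotions
        )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions are returned with confidence percentages, e.g. "Sad 99.7%".
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')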

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
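
Google Cloud Vision reports face attributes as coarse likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the face results above read "Very unlikely" across the board. A minimal sketch with the google-cloud-vision client, assuming application credentials are configured; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as image_file:  # placeholder path
        image = vision.Image(content=image_file.read())

    response = client.face_detection(image=image)

    # One annotation per detected face, each with likelihood enums.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)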

Feature analysis

Amazon

Person 99.8%

Categories

Captions

Microsoft
created on 2022-04-15

a man sitting on a table 74.3%
a man sitting at a table 74.2%
a man sitting in a car 48.4%
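
The ranked captions above match the output of the Azure Computer Vision describe operation, which returns several candidate captions with 0-1 confidence scores. A minimal sketch with the azure-cognitiveservices-vision-computervision package; the endpoint, key, and file name are placeholders:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
        CognitiveServicesCredentials("<your-key>"),              # placeholder key
    )

    with open("photo.jpg", "rb") as image_file:  # placeholder path
        description = client.describe_image_in_stream(image_file, max_candidates=3)

    # Confidence is a 0-1 score; the listing above shows it as a percentage.
    for caption in description.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")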