Human Generated Data

Title

IRT Interior, New York City

Date

1947

People

Artist: N. Jay Jaffee, American, 1921-1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of the Artist, P1998.108

Copyright

© The N. Jay Jaffee Trust. All rights reserved. Used by permission. www.njayjaffee.com

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Transportation 99.9
Train Station 99.9
Train 99.9
Terminal 99.9
Vehicle 99.9
Subway 97.3
Person 96.5
Human 96.5
Person 90.7
Person 85.5
Person 83.5
Person 69.2

Imagga
created on 2021-12-14

architecture 43
building 40.9
old 34.8
city 34.1
window 26.3
street 25.8
wall 25
house 24.2
urban 23.6
door 22.7
device 21.7
travel 19.7
elevator 19
brick 18.8
stone 18.5
ancient 18.2
historic 17.4
structure 16.9
home 16.7
town 16.7
tourism 15.7
prison 15.6
lifting device 15.3
history 15.2
buildings 15.1
sky 14
light 13.4
interior 13.3
construction 12.8
wood 12.5
windows 12.5
gate 12
exterior 12
conveyance 11.6
vintage 11.6
call 11.5
correctional institution 11
roof 10.5
historical 10.3
tramway 10.1
glass 10.1
landmark 9.9
movable barrier 9.9
black 9.7
warehouse 9.6
scene 9.5
grunge 9.4
shop 9.4
facade 9.1
locker 9.1
aged 9
dirty 9
road 9
metal 8.8
abandoned 8.8
empty 8.6
sliding door 8.5
wheeled vehicle 8.3
bridge 8.3
penal institution 8.2
outdoors 8.2
tourist 8.2
passenger car 8.1
reflection 8.1
barrier 8.1
car 8.1
tower 8.1
beam 8
turnstile 7.9
wooden 7.9
cobblestone 7.9
fastener 7.9
lamp 7.8
antique 7.8
arch 7.8
houses 7.7
broken 7.7
outdoor 7.6
destination 7.5
institution 7.5
vacation 7.4
transportation 7.2
office 7.2
steel 7.1
modern 7

Microsoft
created on 2021-12-14

text 99.8
street 98.1
black and white 96.5
monochrome 81.5
person 76
train 75.8
clothing 73.2
white 71.4
city 68.5
door 63.2
open 41.7
opened 12.9

Face analysis

AWS Rekognition

Age 18-30
Gender Female, 94.6%
Sad 74.3%
Calm 22%
Fear 2.5%
Confused 0.6%
Happy 0.2%
Disgusted 0.2%
Surprised 0.2%
Angry 0.1%

AWS Rekognition

Age 25-39
Gender Male, 87.3%
Sad 67.4%
Calm 18.5%
Happy 9.8%
Angry 2.2%
Confused 1%
Surprised 0.7%
Fear 0.3%
Disgusted 0.2%

AWS Rekognition

Age 23-35
Gender Male, 98.5%
Calm 80.1%
Angry 10.3%
Surprised 3.6%
Sad 2.1%
Confused 1.9%
Fear 0.8%
Happy 0.6%
Disgusted 0.6%

AWS Rekognition

Age 46-64
Gender Male, 55.8%
Angry 37.5%
Sad 16.2%
Happy 13.4%
Confused 10.7%
Calm 7.5%
Fear 6.7%
Disgusted 5.9%
Surprised 2.1%

AWS Rekognition

Age 17-29
Gender Male, 97.5%
Sad 48.1%
Calm 43.1%
Angry 3%
Happy 2.4%
Fear 1.4%
Confused 1.2%
Disgusted 0.5%
Surprised 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.5%

Captions

Microsoft

a group of people standing in front of a window 74%
a group of people standing in front of a store window 66.4%
a group of people standing in front of a door 66.3%

Text analysis

Amazon

Jaster
usay Jaster
usay