Human Generated Data

Title

Untitled (University of Pennsylvania students wearing swim suits and posing on rooftop)

Date

1940

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8394

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 95.3
Human 95.3
Person 95
Machine 89.8
Person 88.9
Person 88.4
Person 86.5
Spoke 81.1
Person 78.8
Vehicle 76.4
Transportation 76.4
Wheel 76
Face 71.8
Person 71.5
Clothing 71.4
Apparel 71.4
Tire 70.2
Person 69.1
People 64.7
Motor 62.2
Shorts 62
Car Wheel 61.5
Car 60.3
Automobile 60.3
Alloy Wheel 59.7
Tarmac 58.7
Asphalt 58.7
Motorcycle 58.4
Tree 56.2
Plant 56.2
Person 53.8
Person 42.2
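
The number beside each Amazon tag is a detection confidence on a 0-100 scale. For context, a minimal sketch of how comparable labels can be requested from the AWS Rekognition DetectLabels API with boto3 follows; the file name and confidence cutoff are illustrative assumptions, and this is not necessarily the pipeline behind the record above.

```python
# Hedged sketch: requesting object/scene labels comparable to the Amazon tags
# above via the AWS Rekognition DetectLabels API (boto3). The file name and
# the confidence cutoff are illustrative assumptions, not part of the record.
import boto3

rekognition = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=40,  # the list above bottoms out near 42
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```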

Clarifai
created on 2023-10-25

people 99.9
war 98.7
military 98.6
group together 98.5
many 97.4
soldier 97.1
group 96.8
man 96.5
adult 94.9
uniform 88.2
weapon 87.4
administration 85.7
vehicle 85.7
education 85.5
woman 85.4
child 85.1
skirmish 84.6
several 84.3
outfit 83.7
leader 80.6

Imagga
created on 2022-01-09

city 18.3
architecture 17.5
building 16.7
structure 16.5
sculpture 16.3
travel 15.5
statue 14.7
urban 12.2
memorial 11
history 10.7
automaton 10.3
monument 10.3
sky 10.2
tourism 9.9
vacation 9.8
landmark 9
brass 9
device 8.9
metal 8.8
construction 8.6
room 8.4
modern 8.4
vehicle 8.4
famous 8.4
house 8.4
street 8.3
vintage 8.3
transportation 8.1
tower 8.1
old 7.7
outdoor 7.6
fountain 7.6
stone 7.6
art 7.6
outdoors 7.5
tourist 7.4
car 7.3
business 7.3
lifestyle 7.2
road 7.2
work 7.2
person 7
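
The Imagga tags above likewise pair a keyword with a confidence score. A hedged sketch of one way to request such tags, assuming Imagga's v2 /tags REST endpoint with HTTP Basic authentication; the credentials and image URL are placeholders.

```python
# Hedged sketch: general tagging via Imagga's v2 /tags REST endpoint, assuming
# HTTP Basic auth with an API key/secret pair. Credentials and the image URL
# are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Each returned entry pairs an English keyword with a confidence score (0-100).
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```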

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 98.6
outdoor 94.2
black and white 92.4
waste container 83.5
house 77.7
window 74.7
street 58

Face analysis

AWS Rekognition

Age 39-47
Gender Female, 54.8%
Calm 79.6%
Happy 12.3%
Sad 4%
Surprised 2.9%
Disgusted 0.5%
Confused 0.3%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 54-62
Gender Female, 99.3%
Happy 79.5%
Calm 12.2%
Surprised 7.4%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%
Sad 0.1%
Confused 0.1%

AWS Rekognition

Age 42-50
Gender Male, 99.1%
Confused 43.7%
Calm 39.9%
Disgusted 7.1%
Angry 2.9%
Surprised 2.6%
Sad 2%
Happy 1.5%
Fear 0.4%

AWS Rekognition

Age 45-51
Gender Male, 95.2%
Calm 99.9%
Confused 0%
Surprised 0%
Disgusted 0%
Sad 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 29-39
Gender Female, 97.8%
Happy 84.5%
Calm 4.3%
Sad 4.1%
Fear 2.1%
Disgusted 2.1%
Surprised 1.5%
Angry 0.7%
Confused 0.7%
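
Each AWS Rekognition block above reports, for one detected face, an estimated age range, a gender guess with its confidence, and a confidence distribution over emotion categories. A minimal sketch of how such per-face attributes can be requested with boto3's DetectFaces call; the file name is a placeholder, and this is not necessarily the exact pipeline behind the record.

```python
# Hedged sketch: per-face age range, gender, and emotion confidences like the
# AWS Rekognition entries above, via boto3's DetectFaces call with all
# attributes requested. The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back as a confidence distribution over fixed categories.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```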

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
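
The Google Vision blocks report likelihood ratings (Very unlikely through Very likely) rather than percentages. A hedged sketch of producing such ratings with the google-cloud-vision client's face detection; the file name is a placeholder and application credentials are assumed.

```python
# Hedged sketch: likelihood ratings (VERY_UNLIKELY .. VERY_LIKELY) like the
# Google Vision entries above, via the Cloud Vision face detection API.
# The file name is a placeholder; application credentials are assumed.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihood enum values index into this tuple of human-readable names.
likelihood = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE",
              "LIKELY", "VERY_LIKELY")

for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```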

Feature analysis

Amazon

Person 95.3%

Text analysis

Amazon

F.
12563 F.
12503F.
12563
RO
942
NAVTO
VT3RA2
edition
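
The Amazon text results above are OCR fragments detected in the image, at both line and word granularity. A minimal sketch of requesting them with boto3's DetectText call; the file name is a placeholder.

```python
# Hedged sketch: OCR fragments like the Amazon text results above, via boto3's
# DetectText call, which returns both LINE and WORD level detections.
# The file name is a placeholder.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          f"{detection['Confidence']:.1f}")
```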

Google

12 503 F. PN RRY E 1942 2563 F.
12
503
F.
PN
RRY
E
1942
2563
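
In the Google text results, the first entry is the full detected string and the remaining entries are its individual tokens, which matches how the Cloud Vision text detection API returns annotations: the first annotation's description holds the complete text and the rest are word-level detections. A hedged sketch follows; the file name is a placeholder and application credentials are assumed.

```python
# Hedged sketch: OCR like the Google text results above, via the Cloud Vision
# text detection API. The first annotation's description is the full detected
# string; the remaining annotations are its individual tokens.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```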