Human Generated Data

Title

Untitled (man with rifle standing in front of cargo door)

Date

1942

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7162

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.2
Human 99.2
Airplane 92.3
Transportation 92.3
Vehicle 92.3
Aircraft 92.3
Clothing 73.4
Apparel 73.4
Face 68.6
Standing 67.6
Shorts 61.3
Mirror 61.2
Nature 57.4
Outdoors 57
Text 55.4
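
Tags of this kind come from the Amazon Rekognition DetectLabels API. The sketch below is illustrative only: the file name steinmetz_7162.jpg is hypothetical, and it assumes boto3 is installed with AWS credentials configured in the environment.

```python
# Illustrative sketch only: regenerating Rekognition label tags for a
# local copy of the photograph. The file name is hypothetical; AWS
# credentials are assumed to be configured in the environment.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_7162.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the listed tags bottom out around 55%
)

# Print "Name Confidence" pairs, mirroring the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```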

Clarifai
created on 2023-10-15

people 99.6
vehicle 98.3
aircraft 98.3
airplane 98.2
transportation system 97.8
monochrome 95.7
man 95.3
vehicle window 93.2
street 91.1
airport 88.8
one 86.8
adult 86.3
no person 85.4
travel 84.6
two 81.2
chair 80.6
military 78.8
car 75.8
group together 74.6
illustration 74.6

Imagga
created on 2021-12-15

car 29.6
transportation 29.6
vehicle 26.6
transport 23.7
device 23.6
travel 21.8
auto 21
automobile 20.1
drive 18
speed 16.5
road 16.2
equipment 14.9
driving 14.5
traffic 14.2
technology 14.1
fast 14
engine 13.5
business 13.4
modern 12.6
digital 12.1
electronic equipment 12
monitor 11.9
driver 11.6
sky 11.5
water 11.3
television 10.4
power 10.1
window 10
metal 9.6
highway 9.6
sea 9.4
light 9.3
aircraft 9.3
inside 9.2
ocean 9.1
high 8.7
wing 8.6
seat 8.6
wheel 8.6
people 8.4
cathode-ray tube 8.1
close 8
stereo 8
interior 8
design 7.9
black 7.8
motion 7.7
mirror 7.7
safe 7.5
sport 7.4
safety 7.4
container 7.2
machine 7.1

Microsoft
created on 2021-12-15

text 92.5
black and white 88.2
white 66.2
black 65.4

Face analysis

AWS Rekognition

Age 31-47
Gender Male, 98.8%
Calm 98.1%
Happy 0.4%
Sad 0.3%
Confused 0.3%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
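
Estimates like these come from the Rekognition DetectFaces API. A minimal sketch follows, assuming boto3 credentials are configured; the local file name is hypothetical.

```python
# Illustrative sketch only: age range, gender, and emotion estimates of
# the kind shown above. Assumes boto3 credentials and a hypothetical
# local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_7162.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back unsorted; sort by confidence to match the listing.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```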

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
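
The likelihood ratings above correspond to Google Cloud Vision face detection. A minimal sketch, assuming the google-cloud-vision package and application default credentials; the file name is hypothetical.

```python
# Illustrative sketch only: likelihood ratings of the kind shown above.
# Assumes the google-cloud-vision package and application default
# credentials; the file name is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_7162.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY or UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```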

Feature analysis

Amazon

Person 99.2%
Airplane 92.3%

Text analysis

Amazon

20516A
20
20 SISA
SISA
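
Detections like these come from the Rekognition DetectText API. A minimal sketch, again assuming boto3 credentials and a hypothetical local file name.

```python
# Illustrative sketch only: OCR-style detections like the strings above.
# Assumes boto3 credentials and a hypothetical local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_7162.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# Rekognition reports both LINE and WORD detections, which is why a line
# such as "20 SISA" and its individual words can all appear above.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f"{detection['Confidence']:.1f}")
```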

Google

20516A
20516A
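
The Google entries correspond to Vision API text detection. A minimal sketch under the same assumptions (google-cloud-vision installed, default credentials, hypothetical file name); the first annotation is the full detected text and later ones are individual blocks, which is one way the same string can be listed more than once.

```python
# Illustrative sketch only: Google Vision text detection on the same
# hypothetical local file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_7162.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    print(annotation.description)
```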