Human Generated Data

Title

Untitled (nurse with man donating blood)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7635

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 99.5
Person 99.5
Footwear 93.2
Clothing 93.2
Apparel 93.2
Shoe 93.2
Furniture 85.3
Person 84.4
Person 70.3
Face 69.9
Shoe 67.6
Text 65.7
Table 65
Vehicle 62.1
Transportation 62.1
Bed 60.3
Coat 58.9
Suit 58.9
Overcoat 58.9
Finger 57.7

Clarifai
created on 2023-10-25

people 99.4
street 94.8
group together 94
monochrome 94
man 93.8
airplane 92.9
group 92.9
adult 92.2
transportation system 91.4
vehicle 89.4
war 87.9
military 87.7
aircraft 84.8
car 84.7
woman 84.6
child 83.9
art 81.2
boy 79.4
one 78
many 77.9

Imagga
created on 2022-01-08

cockpit 75.6
seat 39.3
car 31
support 30.2
device 27.6
vehicle 27.2
man 24.2
transportation 21.5
automobile 21.1
work 19.6
person 17.5
transport 16.4
technology 16.3
repair 16.3
auto 15.3
worker 15.1
business 14.6
people 14.5
professional 14.4
adult 14.3
job 14.1
sitting 13.7
driving 13.5
male 13.5
equipment 13.3
working 13.2
machine 13
occupation 11.9
mechanic 11.7
engine 11.5
black 11.5
industry 11.1
driver 10.7
metal 10.5
safety 10.1
garage 9.8
motor 9.8
businessman 9.7
shop 9.7
drive 9.4
laptop 9.3
service 9.3
tool 9.2
industrial 9.1
road 9
computer 8.8
indoors 8.8
wheel 8.5
holding 8.2
flight simulator 8.2
speed 8.2
protection 8.2
light 8
smiling 8
mechanical 7.8
employee 7.6
power 7.6
briefcase 7.5
music 7.2
portrait 7.1
steel 7.1
part 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 95.6
black and white 93.8
monochrome 62.9
drawing 58.7
clothes 19.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 45-51
Gender Female, 53.5%
Calm 89.7%
Sad 6.5%
Surprised 1.9%
Happy 0.8%
Confused 0.4%
Disgusted 0.4%
Fear 0.1%
Angry 0.1%

Feature analysis

Amazon

Person 99.5%
Shoe 93.2%

Captions

Text analysis

Amazon

as
20965B
VI
209653.
VI ممعد
NAGOY
KAMTPAS
ممعد

Google

209
53
209 6 53
6