Human Generated Data

Title

Untitled (men unloading cargo from airplane, Olmstead Airfield, Pennsylvania)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11784

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.7
Human 99.7
Person 95.8
Person 93.1
Furniture 78.9
Person 70.5
Building 64.9
LCD Screen 63
Electronics 63
Screen 63
Monitor 63
Display 63
Person 61.4
Airplane 60.2
Transportation 60.2
Vehicle 60.2
Aircraft 60.2
Architecture 55

Clarifai
created on 2023-10-25

people 99.2
monochrome 97.2
art 95.2
man 94.6
music 94.6
street 93
light 91.6
piano 91
silhouette 91
woman 91
adult 90.7
indoors 89.2
chair 88.2
concert 87.6
dark 87.5
seat 86.5
group 84
shadow 83.9
furniture 83.3
audience 83.3

Imagga
created on 2022-01-15

technology 22.3
man 21.5
equipment 21.1
computer 21
business 20
monitor 19.4
black 18
office 17.4
working 16.8
barbershop 14.9
male 14.2
people 13.9
work 13.3
modern 13.3
electronic equipment 13.2
digital 13
laptop 12.6
network 12.5
shop 12.5
device 12.3
display 12
person 12
professional 11.9
chair 11.5
light 10.7
information 10.6
center 10.2
connection 10
worker 9.8
job 9.7
desk 9.3
inside 9.2
occupation 9.2
mercantile establishment 9.1
data 9.1
indoors 8.8
room 8.6
men 8.6
television 8.5
adult 8.4
communication 8.4
hand 8.4
window 8.3
music 8.1
furniture 8.1
interior 8
art 7.9
table 7.7
industry 7.7
cable 7.6
silhouette 7.4
back 7.3
call 7.3
metal 7.2
science 7.1
architecture 7

Microsoft
created on 2022-01-15

text 99.9
indoor 90
black and white 89.3
concert 88.2
white 68.4
person 68.3
ship 63.9

Face analysis

Amazon

AWS Rekognition

Age 27-37
Gender Male, 99.4%
Happy 76.2%
Calm 14.2%
Disgusted 3.4%
Fear 2.3%
Sad 1.7%
Confused 1%
Surprised 0.8%
Angry 0.4%

AWS Rekognition

Age 26-36
Gender Female, 73.1%
Calm 63.6%
Happy 14.5%
Fear 5.7%
Sad 4.5%
Surprised 4%
Disgusted 3.6%
Confused 2.5%
Angry 1.6%

Feature analysis

Amazon

Person 99.7%
Airplane 60.2%

Text analysis

Amazon

20503.
E0502
25

Google

E1人一KOVK
E1
KOVK