Human Generated Data

Title

Untitled (woman playing piano, Heinz Ocean Pier)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8378

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 97.7
Human 97.7
Person 91.9
Military 80.1
Military Uniform 77.2
Leisure Activities 67.7
Clothing 65.9
Apparel 65.9
Photo 64.1
Portrait 64.1
Face 64.1
Photography 64.1
Furniture 63.6
Person 62.2
Armored 60.3
Army 60.3
Musician 59.9
Musical Instrument 59.9
Girl 59.1
Female 59.1
Soldier 57.3
Chair 56.3
Screen 55.2
Monitor 55.2
Electronics 55.2
Display 55.2
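
These labels follow the shape of Amazon Rekognition's DetectLabels response. As a minimal sketch, assuming AWS credentials are already configured and using a hypothetical file name for the photograph, a comparable tag list could be produced with boto3:

import boto3

client = boto3.client("rekognition")

# Hypothetical local path to a scan of the photograph.
with open("steinmetz_heinz_ocean_pier.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=30,
        MinConfidence=55.0,  # the list above bottoms out around 55%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")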

Clarifai
created on 2023-10-25

people 99.7
music 99.1
adult 98.8
musician 98.1
two 97.8
man 97.4
piano 97.3
instrument 97.3
woman 96.6
one 96.3
group 94.1
furniture 91.9
wear 91.3
monochrome 90.1
singer 89.7
rehearsal 88.7
chair 88.5
actress 86.4
jazz 86
three 85.9
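
Clarifai concepts like these come from its general image-recognition model, which scores each concept between 0 and 1. A sketch against the v2 REST "outputs" endpoint, with a placeholder API key and image URL (the exact model identifier can vary by account):

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
ENDPOINT = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}
headers = {"Authorization": f"Key {API_KEY}"}

resp = requests.post(ENDPOINT, json=payload, headers=headers, timeout=30)
resp.raise_for_status()

# Concepts arrive sorted by confidence; scale to percentages as listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")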

Imagga
created on 2022-01-09

old 17.4
building 16.7
structure 15.2
architecture 15.2
city 14.1
business 13.4
house 12.5
history 12.5
vintage 11.6
metal 10.5
ancient 10.4
retro 9.8
device 9.7
urban 9.6
home 9.6
design 9.6
construction 9.4
paper 9.4
art 8.6
office 8.6
stock 8.4
black 8.4
modern 8.4
dark 8.3
historic 8.2
cash 8.2
work 7.8
people 7.8
travel 7.7
billboard 7.7
money 7.7
finance 7.6
statue 7.5
room 7.5
monument 7.5
light 7.3
new 7.3
steel 7.1
working 7.1
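
Imagga's tags could be retrieved much the same way. A sketch against its v2 /tags endpoint, which uses HTTP basic auth; the key, secret, and image URL are placeholders:

import requests

API_KEY = "YOUR_IMAGGA_API_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_API_SECRET"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")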

Microsoft
created on 2022-01-09

text 98.6
black and white 96.1
person 76.9
monochrome 73.7
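
These tags resemble the output of Azure Computer Vision's tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and file name are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # placeholder
)

with open("steinmetz_heinz_ocean_pier.jpg", "rb") as f:
    analysis = client.tag_image_in_stream(f)

# Confidences come back in [0, 1]; scale to match the list above.
for tag in analysis.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")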

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 83.1%
Calm 83.9%
Surprised 7.1%
Sad 3.1%
Confused 2.6%
Disgusted 1.9%
Happy 0.6%
Angry 0.5%
Fear 0.2%
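
The age range, gender estimate, and emotion breakdown match the shape of Rekognition's DetectFaces response when all attributes are requested. A sketch, again with a hypothetical file name:

import boto3

client = boto3.client("rekognition")

with open("steinmetz_heinz_ocean_pier.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are scored independently; sort to show the dominant one first.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")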

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
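
Google Vision reports face attributes as coarse likelihood buckets rather than percentages. A sketch with the google-cloud-vision client, assuming application-default credentials are set up and using a hypothetical file name:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_heinz_ocean_pier.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)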

Feature analysis

Amazon

Person 97.7%

Captions

Microsoft
created on 2022-01-09

a group of people standing in front of a bus 27.9%
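
The caption and its confidence score match Azure Computer Vision's image-description operation. A sketch with the same placeholder endpoint and key as above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),        # placeholder
)

with open("steinmetz_heinz_ocean_pier.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=1)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")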

Text analysis

Amazon

17241.
XAGOX
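
Strings like these are what Rekognition's DetectText returns for lettering in the photograph. A sketch, with the same hypothetical file name:

import boto3

client = boto3.client("rekognition")

with open("steinmetz_heinz_ocean_pier.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# Rekognition returns both LINE and WORD detections; keep whole lines.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])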