Human Generated Data

Title

Untitled (woman playing piano for crowd in front of H.J. Heinz Co. Building)

Date

1941

People

Artist: Joseph Janney Steinmetz, American (1905-1985)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4409

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.1
Audience 99.1
Crowd 99.1
Person 96.9
Person 86.4
Person 83.3
Interior Design 82.5
Indoors 82.5
Person 77.5
Architecture 75.6
Building 75.6
Person 72.9
Pillar 71.7
Column 71.7
Person 67.1
Nature 63.3
Lighting 61.3
Person 60.3
Lecture 57.1
Speech 57.1
Concert 55.1
Person 54
Person 50.6
Person 49.5
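
The label and confidence pairs above have the shape of Amazon Rekognition label-detection output (confidence on a 0-100 scale). A minimal sketch of how similar tags could be requested with boto3 follows; the local file name and the thresholds are illustrative assumptions, not part of this record.

```python
# Sketch: label/confidence tags like those above via Amazon Rekognition.
# Assumes AWS credentials and a region are configured; the file name and
# thresholds are illustrative.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_heinz_piano.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=45,  # the list above bottoms out near 50
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```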

Clarifai
created on 2023-10-26

people 99.5
many 98.7
group 98.2
man 97.3
administration 96.9
adult 95.8
monochrome 93.8
group together 92
woman 91.2
leader 90.2
no person 89.9
crowd 88.7
architecture 86.5
building 86.1
war 83.3
chair 82
several 79.7
outdoors 76.2
child 76.1
street 74.7
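
Clarifai's concept tags are typically returned as name/value pairs by its v2 predict API. The sketch below uses the plain REST endpoint; the model id, user/app ids, auth header format, and image URL are assumptions to verify against Clarifai's current documentation.

```python
# Sketch: concept tags like those above via Clarifai's v2 REST API.
# Endpoint path, public model id, user/app ids, and the "Key <token>" auth
# scheme are assumptions; the token and image URL are placeholders.
import os
import requests

url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
headers = {"Authorization": f"Key {os.environ['CLARIFAI_PAT']}"}
payload = {
    "user_app_id": {"user_id": "clarifai", "app_id": "main"},  # owner of the public model (assumed)
    "inputs": [
        {"data": {"image": {"url": "https://example.org/steinmetz_heinz_piano.jpg"}}}
    ],
}

response = requests.post(url, headers=headers, json=payload, timeout=30)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```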

Imagga
created on 2022-01-23

hall 42.8
city 41.5
building 38.8
architecture 34.3
urban 26.2
night 25.7
sky 24.2
classroom 24
center 23.3
travel 23.2
shop 22
room 20.8
bakery 20.6
mercantile establishment 20
buildings 18.9
tourism 17.3
bridge 17
cityscape 16.1
landmark 15.3
skyline 15.2
river 15.1
structure 14.5
water 14
town 13.9
famous 13
light 12.7
modern 12.6
place of business 12.5
panorama 12.4
office 12.2
lights 12
street 12
supermarket 11.9
tower 11.6
downtown 11.5
landscape 11.1
house 11
clouds 11
business 10.9
old 10.4
scene 10.4
monument 10.3
evening 10.3
tourist 10.1
grocery store 9.6
port 9.6
roof 9.6
skyscraper 9.6
destination 9.3
transportation 9
history 8.9
windows 8.6
sea 8.6
illuminated 8.6
construction 8.6
transport 8.2
facade 8.1
new 8.1
district 7.8
houses 7.7
dusk 7.6
dark 7.5
boat 7.4
reflection 7.3
station 7.3
industrial 7.3
people 7.2
marketplace 7.2
road 7.2
coast 7.2
home 7.2
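
Imagga's WordNet-style tags (for example "mercantile establishment", "place of business") come from its tagging endpoint. A minimal sketch against the v2 /tags REST endpoint; the credentials and image URL are placeholders, and the response field names should be checked against Imagga's docs.

```python
# Sketch: tags like those above via Imagga's v2 tagging endpoint.
# Credentials and the image URL are placeholders; field names are assumptions.
import os
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz_heinz_piano.jpg"},
    auth=(os.environ["IMAGGA_API_KEY"], os.environ["IMAGGA_API_SECRET"]),
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```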

Google
created on 2022-01-23

Photograph 94.3
Window 91.4
Building 91.1
Black 89.8
Black-and-white 86.7
Style 84
Line 82.2
Font 81.1
Adaptation 79.4
Monochrome 78.8
Monochrome photography 78
Snapshot 74.3
Event 72.6
Art 69.8
Facade 69.6
Crowd 67.5
Stock photography 66.3
Rectangle 65.9
Room 65.6
History 65.4
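
The Google tags above match the label annotations the Cloud Vision API returns (scores on a 0-1 scale, shown here as percentages). A minimal sketch with the google-cloud-vision client, assuming a local copy of the photograph:

```python
# Sketch: labels like those above via the Google Cloud Vision API.
# The file name is illustrative; credentials are assumed configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_heinz_piano.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

for annotation in response.label_annotations:
    # score is 0-1; the list above presents it as a percentage
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```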

Microsoft
created on 2022-01-23

person 89
text 85.5
black 83.2
people 81.9
group 75.1
clothing 65.1
man 56.7
old 54.6
crowd 1.4
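
The Microsoft tags resemble the Tags feature of Azure's Computer Vision analyze endpoint. A minimal REST sketch; the API version (v3.2), the header name, and the placeholders for the resource endpoint, key, and image URL are assumptions to verify against Azure's documentation.

```python
# Sketch: tags like those above via Azure Computer Vision's "analyze" REST endpoint.
# Endpoint, key, and image URL are placeholders; the API version is an assumption.
import os
import requests

endpoint = os.environ["AZURE_CV_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
key = os.environ["AZURE_CV_KEY"]

response = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.org/steinmetz_heinz_piano.jpg"},
    timeout=30,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```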

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 94.2%
Happy 40.2%
Sad 31.9%
Angry 11%
Calm 5.1%
Disgusted 4.8%
Surprised 3.4%
Fear 2.8%
Confused 1%

AWS Rekognition

Age 20-28
Gender Female, 74.2%
Happy 39.8%
Sad 25.1%
Calm 20.2%
Disgusted 4.6%
Confused 3.1%
Angry 2.9%
Fear 2.4%
Surprised 2%

AWS Rekognition

Age 30-40
Gender Male, 74.1%
Calm 95%
Sad 3.1%
Happy 0.5%
Confused 0.4%
Surprised 0.3%
Fear 0.2%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 35-43
Gender Female, 63.6%
Happy 97.8%
Sad 0.8%
Calm 0.7%
Disgusted 0.2%
Surprised 0.2%
Angry 0.1%
Fear 0.1%
Confused 0.1%

AWS Rekognition

Age 25-35
Gender Male, 89.1%
Calm 76.5%
Sad 9%
Happy 3.9%
Fear 3.5%
Disgusted 3.4%
Confused 1.9%
Surprised 1.1%
Angry 0.7%

AWS Rekognition

Age 22-30
Gender Male, 89.2%
Calm 59.8%
Happy 27.6%
Sad 9.2%
Surprised 1%
Angry 0.8%
Disgusted 0.7%
Confused 0.5%
Fear 0.4%

AWS Rekognition

Age 22-30
Gender Female, 94.4%
Calm 94.6%
Sad 3.9%
Surprised 0.4%
Happy 0.3%
Angry 0.3%
Fear 0.2%
Disgusted 0.2%
Confused 0.1%

AWS Rekognition

Age 23-33
Gender Female, 69.4%
Calm 71.1%
Sad 10.2%
Angry 5.1%
Surprised 3.8%
Disgusted 3%
Happy 2.8%
Confused 2.4%
Fear 1.7%

AWS Rekognition

Age 22-30
Gender Male, 99.5%
Calm 53.4%
Happy 34%
Fear 4.3%
Sad 4%
Angry 1.8%
Disgusted 1%
Confused 0.8%
Surprised 0.6%

AWS Rekognition

Age 21-29
Gender Female, 80.5%
Angry 58.5%
Happy 24.9%
Surprised 7%
Calm 5.1%
Fear 1.4%
Disgusted 1.2%
Sad 1%
Confused 0.8%

AWS Rekognition

Age 21-29
Gender Female, 83.7%
Calm 99.3%
Happy 0.3%
Surprised 0.1%
Sad 0.1%
Disgusted 0%
Fear 0%
Angry 0%
Confused 0%

AWS Rekognition

Age 14-22
Gender Male, 54.5%
Calm 91.5%
Sad 4.1%
Disgusted 1.1%
Angry 0.8%
Fear 0.7%
Confused 0.6%
Happy 0.6%
Surprised 0.6%

AWS Rekognition

Age 28-38
Gender Female, 89%
Calm 96.8%
Sad 1.9%
Happy 0.4%
Confused 0.3%
Surprised 0.3%
Angry 0.2%
Disgusted 0.2%
Fear 0.1%
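
Each face block above follows the shape of an Amazon Rekognition FaceDetails record: an age range, a gender estimate with confidence, and emotions ranked by confidence. A minimal sketch, assuming configured AWS credentials and an illustrative file name:

```python
# Sketch: age/gender/emotion estimates like those above via Amazon Rekognition
# face detection. File name is illustrative; credentials are assumed configured.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_heinz_piano.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```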

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
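
The likelihood ratings above correspond to Cloud Vision face-annotation fields, which report enum likelihoods rather than percentages. A minimal sketch with the same assumed local file:

```python
# Sketch: likelihood ratings like those above via Google Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_heinz_piano.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihoods are enum values such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```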

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

HEINZ
CO.
J.
H. J. HEINZ CO.
17237.
H.
ЛАСОЖ
YT37A2
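
The detected strings above are the kind of output Amazon Rekognition text detection returns, as LINE and WORD entries. A minimal sketch:

```python
# Sketch: text detections like "H. J. HEINZ CO." above via Amazon Rekognition.
# File name is illustrative; credentials are assumed configured.
import boto3

rekognition = boto3.client("rekognition")

with open("steinmetz_heinz_piano.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries give whole phrases, WORD entries the individual tokens
    print(detection["Type"], detection["DetectedText"])
```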

Google

37. CO. H. J. HEINZ 17237.
37.
CO.
H.
J.
HEINZ
17237.
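
Google's results, a full line followed by its individual tokens, match the structure of Cloud Vision text annotations, where the first annotation is the complete detected string and the rest are the separate words. A minimal sketch:

```python
# Sketch: OCR results like those above via Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz_heinz_piano.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

for annotation in response.text_annotations:
    # The first annotation is the full detected block; the rest are individual tokens
    print(annotation.description)
```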