Human Generated Data

Title

Untitled (group of people sitting outside of Lazarus Coffee Shop, Tarpon Springs)

Date

1947

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5551

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.7
Human 99.7
Person 98.2
Person 97.2
Person 94.9
Person 93.9
Person 92.4
People 90.4
Person 89.7
Person 86.6
Person 85.2
Housing 83.6
Building 83.6
Person 82.5
Person 82
Person 80
Person 78.7
Shoe 77.5
Clothing 77.5
Footwear 77.5
Apparel 77.5
Person 76
Person 73.5
Person 69.3
Porch 64.8
House 63.5
Mansion 63.4
Family 60.6
Text 57.7
Person 50.9
Person 50.1
Person 42.1
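
These label/confidence pairs have the shape returned by Amazon Rekognition's DetectLabels API. A minimal sketch of the kind of call that could produce them, assuming the image sits in S3 (the bucket and key names are hypothetical placeholders):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    # Hypothetical S3 location for the digitized photograph.
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-22945a.jpg"}},
    MinConfidence=40,  # the list above bottoms out near 42%
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    # Repeated rows such as the many "Person" entries correspond to
    # per-instance detections, each with its own confidence.
    for instance in label.get("Instances", []):
        print(f'{label["Name"]} {instance["Confidence"]:.1f}')
```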

Imagga
created on 2022-01-23

sky 20.4
building 16.7
travel 16.2
structure 15.4
summer 12.8
people 12.8
beach 12.8
sea 12.6
school 12.6
water 12
city 11.6
old 11.1
room 11
island 11
coast 10.8
transportation 10.8
tourism 10.7
outdoor 10.7
sand 10.6
architecture 10.3
park 10
ocean 9.9
landscape 9.7
holiday 9.3
black 9
world 9
vacation 9
sun 8.8
scene 8.6
tree 8.5
clouds 8.4
field 8.4
transport 8.2
hall 8.1
man 8.1
newspaper 8
day 7.8
sunny 7.7
house 7.6
coastline 7.5
life 7.4
tourist 7.4
speed 7.3
business 7.3
person 7.2
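
Imagga exposes its tagger as a REST endpoint. A sketch against the v2 /tags route, assuming basic-auth API credentials and a public image URL (all placeholders; the response shape is assumed from Imagga's v2 documentation):

```python
import requests

API_KEY = "your_api_key"        # placeholder Imagga credentials
API_SECRET = "your_api_secret"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz-22945a.jpg"},  # placeholder
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each tag pairs a confidence with a language-keyed label, e.g. "sky 20.4".
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```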

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 99.6
person 92.7
clothing 88.8
people 83.4
woman 66.7
white 64
wedding dress 61.7
old 54.9
man 52.3
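
The Microsoft tags follow the same name-plus-confidence pattern. A sketch using the Azure Computer Vision Python SDK's tag_image call (endpoint, key, and image URL are placeholders):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),    # placeholder key
)

result = client.tag_image("https://example.org/steinmetz-22945a.jpg")  # placeholder URL
for tag in result.tags:
    # The SDK reports confidences on a 0-1 scale; scale to match the list above.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```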

Face analysis

AWS Rekognition

Age 24-34
Gender Female, 89.9%
Sad 53.7%
Happy 23.2%
Calm 7.8%
Confused 5.5%
Surprised 4.5%
Angry 3.3%
Disgusted 1.3%
Fear 0.7%

AWS Rekognition

Age 23-33
Gender Female, 80.9%
Calm 74.7%
Confused 11.4%
Sad 4.9%
Happy 4.1%
Angry 1.8%
Surprised 1.1%
Disgusted 1.1%
Fear 0.9%

AWS Rekognition

Age 37-45
Gender Female, 55.3%
Happy 67.3%
Surprised 11.1%
Sad 6.6%
Calm 6.2%
Confused 4.9%
Fear 2%
Angry 0.9%
Disgusted 0.9%

AWS Rekognition

Age 28-38
Gender Male, 93.8%
Happy 65.5%
Calm 18.3%
Sad 7.7%
Disgusted 2%
Fear 1.9%
Confused 1.7%
Angry 1.7%
Surprised 1.1%

AWS Rekognition

Age 26-36
Gender Female, 64.1%
Fear 65.7%
Calm 15.6%
Happy 11.2%
Surprised 3.3%
Disgusted 1.4%
Sad 1.4%
Angry 1%
Confused 0.3%

AWS Rekognition

Age 23-31
Gender Male, 71.8%
Calm 70.8%
Happy 8.5%
Surprised 7.8%
Sad 4.5%
Angry 3.5%
Fear 2.4%
Disgusted 1.6%
Confused 0.8%

AWS Rekognition

Age 35-43
Gender Female, 95.8%
Happy 98.2%
Fear 0.5%
Sad 0.5%
Calm 0.3%
Confused 0.2%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 21-29
Gender Male, 97.8%
Calm 58.9%
Fear 24.7%
Happy 4.6%
Sad 3.4%
Confused 3.2%
Disgusted 2%
Angry 1.7%
Surprised 1.5%

AWS Rekognition

Age 29-39
Gender Female, 50.5%
Sad 51.2%
Happy 23.6%
Calm 10.5%
Confused 7.1%
Disgusted 3.5%
Angry 1.7%
Surprised 1.4%
Fear 1.1%

AWS Rekognition

Age 35-43
Gender Male, 89.8%
Calm 59.7%
Sad 12.1%
Fear 8.1%
Confused 6%
Angry 5.1%
Disgusted 4.3%
Surprised 3.2%
Happy 1.6%

AWS Rekognition

Age 23-31
Gender Male, 77.7%
Happy 57.7%
Calm 20%
Sad 8.5%
Confused 5%
Disgusted 4.7%
Fear 1.7%
Angry 1.2%
Surprised 1.1%

AWS Rekognition

Age 33-41
Gender Male, 90.1%
Calm 96.1%
Sad 1.1%
Surprised 1%
Fear 0.6%
Angry 0.5%
Happy 0.3%
Disgusted 0.2%
Confused 0.2%
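
Each block above (an age range, a gender estimate, and eight emotion scores) mirrors one FaceDetail from Rekognition's DetectFaces API when full attributes are requested. A minimal sketch, again with a hypothetical S3 location:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-22945a.jpg"}},
    Attributes=["ALL"],  # required for age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort descending to match the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```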

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely
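
The Google Vision blocks report bucketed likelihoods rather than percentages. A sketch with the google-cloud-vision client (the image URI is a placeholder; in recent client versions each likelihood field is an enum whose .name yields strings like VERY_UNLIKELY):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image()
image.source.image_uri = "gs://example-bucket/steinmetz-22945a.jpg"  # placeholder

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```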

Feature analysis

Amazon

Person 99.7%
Shoe 77.5%

Captions

Microsoft

a group of people standing in front of a building 82.9%
a group of people in an old photo of a person 71.6%
a group of people standing outside of a building 71.5%
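
These ranked, scored captions look like output from Azure Computer Vision's describe operation. A sketch using the same SDK as the tags example (endpoint, key, and URL again placeholders; max_candidates is assumed to cap the number of caption candidates):

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("your_subscription_key"),    # placeholder key
)

description = client.describe_image(
    "https://example.org/steinmetz-22945a.jpg",  # placeholder URL
    max_candidates=3,
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```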

Text analysis

Amazon

KEY,
SIESTA
J.
RO
J. J. STEINMETZ, SIESTA KEY, SARASOTA,-FLA.
STEINMETZ,
SARASOTA,-FLA.
22945A
YT3RAS
COF

Google

J.
STEINMETZ,
SIESTA
SARASOTA,
22945
FLA.
22945 A J. J. STEINMETZ, SIESTA KEY, SARASOTA, FLA.
A
KEY,
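
Both OCR lists mix whole lines with their component words. For the Amazon list, a minimal Rekognition DetectText sketch (S3 location hypothetical):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-22945a.jpg"}},
)

# Rekognition returns whole LINE detections plus their component WORDs, which is
# why "J. J. STEINMETZ, SIESTA KEY, SARASOTA,-FLA." appears alongside its fragments.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])
```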