Human Generated Data

Title

Untitled (family standing on steps of porch)

Date

1957

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8984

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (scores are model confidence, 0-100)

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.4
Person 99.2
Person 97.1
Person 95.7
Person 93.8
Wheel 92.8
Machine 92.8
Bicycle 89
Bike 89
Transportation 89
Vehicle 89
Clothing 81.3
Apparel 81.3
Wheel 78.6
Person 76.4
Person 69
Building 61
People 60.8
Chair 58.1
Furniture 58.1
Handrail 57.2
Banister 57.2
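
These label/confidence pairs match the output of Amazon Rekognition's DetectLabels operation. As a rough illustration only (not the pipeline actually used to build this record), a minimal boto3 sketch that prints pairs in the same form; the image filename is a placeholder:

import boto3

# Placeholder path; the actual image source for this record is not given.
with open("steinmetz_4.2002.8984.jpg", "rb") as f:
    image_bytes = f.read()

client = boto3.client("rekognition")  # assumes AWS credentials are configured

# DetectLabels returns label names with confidence scores on a 0-100 scale.
response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=55)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")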

Imagga
created on 2022-01-09

case 63.1
interior 40.7
window 39.3
house 35.9
architecture 30
home 28.7
room 27.2
table 24.7
furniture 24.4
modern 22.4
decor 22.1
building 20.7
glass 20.3
indoors 20.2
design 18
luxury 18
inside 17.5
apartment 17.2
shelf 16.9
chair 16.8
indoor 16.4
kitchen 15.6
light 15.4
residential 15.3
counter 13.5
wall 13
wood 12.5
urban 12.2
floor 12.1
decoration 11.6
lamp 11.5
new 11.3
structure 11.2
city 10.8
cabinet 10.7
travel 10.6
sofa 10.5
contemporary 10.3
elegance 10.1
office 9.8
comfortable 9.5
hotel 9.5
estate 9.5
elegant 9.4
3d 9.3
restaurant 9.3
domestic 9
style 8.9
steel 8.8
dining 8.6
real 8.5
living 8.5
business 8.5
relaxation 8.4
people 8.4
sink 8.3
shop 8.3
tile 8.1
balcony 8
expensive 7.7
monument 7.5
tourism 7.4
seat 7.4
color 7.2
framework 7.2
work 7.1
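
Imagga exposes comparable tagging through its REST /v2/tags endpoint. A hedged sketch using the requests library; the credentials and image URL below are placeholders:

import requests

# Placeholder credentials and image URL, for illustration only.
auth = ("api_key", "api_secret")
image_url = "https://example.com/steinmetz_4.2002.8984.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": image_url},
    auth=auth,
)
resp.raise_for_status()

# Each result carries a 0-100 confidence and a tag name per language.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {round(tag['confidence'], 1)}")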

Google
created on 2022-01-09

Building 93.4
Window 92
Black 89.7
Black-and-white 85.5
Style 83.8
Facade 75.8
Rectangle 75.8
Snapshot 74.3
Monochrome photography 73.8
Monochrome 72.8
Room 69.5
Art 68.6
House 67.5
Stock photography 63.6
Wheel 59.6
Glass 59.5
Visual arts 58.5
Suit 58.2
History 57.8
Street 54
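
The Google entries correspond to Cloud Vision label detection. A minimal sketch with the google-cloud-vision client; the image path is a placeholder, and scores come back on a 0-1 scale, shown here as percentages to match the listing:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application credentials are set

with open("steinmetz_4.2002.8984.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# label_annotations carry a description and a 0-1 score.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")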

Microsoft
created on 2022-01-09

black and white 88.8
text 87
person 80.1
house 55
window 17.4
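
The Microsoft tags look like output from Azure Computer Vision's tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint/key/URL, for illustration only.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

result = client.tag_image("https://example.com/steinmetz_4.2002.8984.jpg")

# Each tag has a name and a 0-1 confidence.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")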

Face analysis

AWS Rekognition

Age 23-31
Gender Male, 88.8%
Calm 45.5%
Angry 23.6%
Surprised 8.4%
Happy 7.5%
Disgusted 7.1%
Confused 4.2%
Sad 2.5%
Fear 1.2%

AWS Rekognition

Age 47-53
Gender Female, 99.2%
Sad 59.2%
Happy 37%
Angry 1.3%
Calm 0.7%
Disgusted 0.6%
Surprised 0.4%
Fear 0.4%
Confused 0.4%

AWS Rekognition

Age 20-28
Gender Male, 71%
Sad 29.2%
Calm 20.7%
Disgusted 18.1%
Surprised 12.6%
Happy 11%
Confused 3.6%
Angry 3.6%
Fear 1.2%

AWS Rekognition

Age 24-34
Gender Female, 95.9%
Happy 41.7%
Calm 41.1%
Surprised 5.7%
Fear 3.2%
Disgusted 3%
Angry 2%
Confused 1.7%
Sad 1.6%

AWS Rekognition

Age 16-24
Gender Female, 97.9%
Happy 56.7%
Sad 15.9%
Calm 10.6%
Angry 5.8%
Surprised 4.7%
Disgusted 3.1%
Fear 2.3%
Confused 0.9%
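
The five blocks above resemble per-face results from Rekognition's DetectFaces operation with full attributes requested. A hedged boto3 sketch, again with a placeholder image path:

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("steinmetz_4.2002.8984.jpg", "rb") as f:  # placeholder path
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back with confidences; sort to mirror the listing above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")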

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
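
The nine Google Vision blocks are per-face likelihood ratings from face detection; unlike the services above, they are enum values (e.g. VERY_UNLIKELY) rather than numeric scores. A sketch with google-cloud-vision, placeholder path as before:

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application credentials are set

with open("steinmetz_4.2002.8984.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

def pretty(likelihood):
    # Render the Likelihood enum (e.g. VERY_UNLIKELY) as "Very unlikely".
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in response.face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))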

Feature analysis

Amazon

Person 99.8%
Wheel 92.8%
Bicycle 89%

Captions

Microsoft

a group of men sitting in front of a window 42.4%
a group of men sitting in front of a store window 34.6%
a group of people sitting in front of a window 34.5%
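
Azure's describe-image operation returns ranked caption candidates with confidences, like the three above. A sketch reusing the placeholder client from the tagging example:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint/key/URL, as in the tagging sketch above.
client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("<subscription-key>"),
)

# max_candidates asks for several alternative captions, not just the best one.
description = client.describe_image(
    "https://example.com/steinmetz_4.2002.8984.jpg",
    max_candidates=3,
)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")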

Text analysis

Amazon

42463.
MJ17--YT37A--AX

Google

42463.
42463.
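
These OCR lines resemble Rekognition DetectText and Cloud Vision text detection output. A combined sketch with a placeholder path; note that Cloud Vision returns the full text block first and then each detected word, which is likely why "42463." appears twice in the Google listing:

import boto3
from google.cloud import vision

with open("steinmetz_4.2002.8984.jpg", "rb") as f:  # placeholder path
    image_bytes = f.read()

# Amazon: DetectText returns LINE and WORD detections; print the lines.
rekognition = boto3.client("rekognition")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print(det["DetectedText"])

# Google: the first annotation is the full text, the rest are individual words.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
for annotation in response.text_annotations:
    print(annotation.description)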