Human Generated Data

Title

Prefabricated Copper Houses, 1931-1932

Date

c. 1932

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of the Architect's Collaborative, BRGA.57.292

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Human 99.7
Person 99.7
Person 99.4
Person 99.3
Person 99.1
Person 99.1
Clothing 90.1
Apparel 90.1
Nature 76.4
Overcoat 71.3
Coat 71.3
Suit 70.6
Outdoors 68.3
Building 67.7
Standing 66.6
Wall 63.5
Pedestrian 61.8
Art 61.3
Silhouette 57.7
Advertisement 56.4
Countryside 56.3
Housing 55.9

Imagga
created on 2022-03-11

sliding door 88.6
door 77.6
movable barrier 53.7
barrier 36.3
building 24.2
architecture 23.5
street 23
travel 21.1
city 20.8
old 19.5
obstruction 18.4
house 18.4
refrigerator 15.4
room 15.4
transportation 15.2
wall 15
urban 14
pay-phone 13.9
light 13.4
window 13
home 12.8
telephone 12.7
wood 12.5
interior 12.4
stone 12.1
train 12
white goods 11.8
corridor 11.8
empty 11.2
road 10.8
indoors 10.5
sidewalk 10.4
town 10.2
passage 10.1
transport 10
industrial 10
tourism 9.9
station 9.9
modern 9.8
line 9.4
construction 9.4
man 9.4
industry 9.4
glass 9.3
dark 9.2
history 8.9
home appliance 8.9
sky 8.9
electronic equipment 8.9
doors 8.8
rail 8.8
entrance 8.7
structure 8.6
tourist 8.2
equipment 8
railroad 7.9
railway 7.8
scene 7.8
windows 7.7
office 7.5
place 7.4
historic 7.3
metal 7.2
prison 7.1

Google
created on 2022-03-11

Building 89.3
Door 88
Rectangle 88
Black-and-white 85.9
Gesture 85.3
Style 84
Plant 83.9
Wall 82.3
Cloud 78.7
Tints and shades 77.3
Monochrome 76.6
Sky 76.4
Monochrome photography 76
Art 72.6
Facade 70.8
Suit 70.7
Tree 70.5
Room 68.8
Landscape 66.5
Stock photography 62.8

Microsoft
created on 2022-03-11

black and white 97.2
street 93.1
scene 93
outdoor 90.7
text 90.3
monochrome 82
gallery 80.3
man 78.8
person 75.5
way 67
house 55.7
room 42.8
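
The tag blocks above all share one shape: a label (which may itself contain spaces, e.g. "black and white") followed by a trailing numeric confidence score. A minimal sketch of parsing such lines into (label, confidence) pairs, assuming whitespace-separated lines in that shape (the helper name is illustrative, not part of any vendor API):

```python
def parse_labels(lines):
    """Parse lines like 'Human 99.7' into (label, confidence) pairs.

    The confidence is the last whitespace-separated token; everything
    before it is the label, which may contain spaces.
    """
    pairs = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        try:
            pairs.append((label, float(score)))
        except ValueError:
            continue  # skip lines without a trailing numeric score
    return pairs

tags = parse_labels([
    "Human 99.7",
    "black and white 97.2",
    "Tints and shades 77.3",
])
print(tags)
```

Sorting the resulting pairs by the second element in descending order reproduces the confidence-ranked ordering used in each provider block.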

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 96.6%
Fear 45%
Calm 33.5%
Happy 12.5%
Angry 5.5%
Surprised 2%
Sad 0.7%
Disgusted 0.6%
Confused 0.1%

AWS Rekognition

Age 34-42
Gender Female, 51.5%
Calm 78.4%
Surprised 10%
Sad 4.9%
Angry 2.9%
Confused 2%
Disgusted 1%
Happy 0.4%
Fear 0.3%

AWS Rekognition

Age 21-29
Gender Male, 52.7%
Surprised 28.8%
Disgusted 24.8%
Calm 23%
Fear 9.1%
Confused 8.3%
Happy 2.3%
Angry 1.8%
Sad 1.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a man standing in front of a window 84.2%
a man that is standing in front of a window 82.3%
a man standing next to a window 79%

Text analysis

Amazon

19
RAA

Google

19
19