Human Generated Data

Title

Prefabricated Copper Houses, 1931-1932

Date

c. 1932

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of the Architect's Collaborative, BRGA.57.290

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.6
Human 99.6
Person 99.4
Person 99.2
Person 99.2
Person 97.7
Collage 96.6
Poster 96.6
Advertisement 96.6
Clothing 79.6
Apparel 79.6
Pedestrian 76.5
Overcoat 63.1
Coat 63.1
Flyer 62.2
Brochure 62.2
Paper 62.2
Suit 58.2
People 56.6
Silhouette 55.9
Corridor 55.1

Clarifai
created on 2023-10-22

monochrome 99.8
people 99
street 98.3
man 95.5
architecture 94
woman 93.2
black and white 92.6
family 91.3
city 90.7
window 90.2
adult 90.1
silhouette 89.9
travel 87.8
square 87.1
boat 87.1
sea 86.1
child 86
reflection 85.7
art 85.1
house 85

Imagga
created on 2022-03-11

monitor 100
electronic equipment 76.1
equipment 63.5
computer 42.7
technology 32.6
business 23.1
work 22.8
desktop computer 22.5
office 22.5
3d 20.1
personal computer 18.7
working 17.7
modern 17.5
laptop 17.1
desk 16.6
display 16.5
home 15.9
room 15.5
window 15.3
keyboard 15
screen 14.4
architecture 14.1
digital computer 14
furniture 13.6
house 13.4
interior 13.3
building 13.1
data 12.8
digital 11.3
three dimensional 11.2
device 11.1
center 10.8
information 10.6
metal 10.5
network 10.3
paper 10.2
design 10.1
machine 10
television 9.9
hardware 9.6
render 9.5
electronics 9.5
table 9.4
money 9.4
glass 9.3
people 8.9
printer 8.9
worker 8.9
door 8.8
light 8.7
apartment 8.6
support 8.6
wall 8.5
industry 8.5
chair 8.5
finance 8.4
electronic 8.4
phone 8.3
wealth 8.1
kitchen 8
notebook 7.9
computers 7.8
desktop 7.7
communication 7.6
service 7.4
security 7.3
occupation 7.3
object 7.3
steel 7.1

Google
created on 2022-03-11

Microsoft
created on 2022-03-11

text 97.9
black and white 96.4
house 93.9
monochrome 78.4
street 59.7
person 57.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-24
Gender Male, 69.2%
Calm 99.5%
Fear 0.4%
Surprised 0.1%
Happy 0%
Sad 0%
Angry 0%
Disgusted 0%
Confused 0%

AWS Rekognition

Age 34-42
Gender Male, 96.1%
Calm 78.1%
Angry 20.2%
Sad 0.5%
Happy 0.4%
Surprised 0.3%
Confused 0.2%
Fear 0.2%
Disgusted 0.1%

AWS Rekognition

Age 20-28
Gender Male, 79.7%
Fear 35.2%
Sad 31%
Calm 16.7%
Angry 5.1%
Surprised 4.9%
Disgusted 4.2%
Happy 1.7%
Confused 1.2%

AWS Rekognition

Age 19-27
Gender Female, 97.7%
Calm 37.7%
Disgusted 22.9%
Angry 11.8%
Surprised 7.8%
Fear 6.9%
Happy 6.1%
Sad 4.5%
Confused 2.3%

Feature analysis

Amazon

Person
Person 99.6%
Person 99.4%
Person 99.2%
Person 99.2%
Person 97.7%

Categories

Imagga

paintings art 98.8%

Captions

Microsoft
created on 2022-03-11

graphical user interface 49.8%
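
Note on the machine-generated data: the Amazon tag scores and face attributes listed above match the shape of output from AWS Rekognition's DetectLabels and DetectFaces operations. The sketch below is a minimal, hypothetical example of how similar data could be retrieved for an image with boto3; the client setup, file name, and thresholds are assumptions and are not part of the museum record.

import boto3

# Assumes AWS credentials are configured locally; "photo.jpg" is a placeholder file name.
client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Object/scene labels with confidence scores, comparable to the "Tags" list above.
labels = client.detect_labels(Image={"Bytes": image_bytes}, MaxLabels=25, MinConfidence=55)
for label in labels["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))

# Per-face age range, gender, and emotion scores, comparable to the "Face analysis" entries.
faces = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print("Gender", face["Gender"]["Value"], round(face["Gender"]["Confidence"], 1))
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(emotion["Type"].title(), round(emotion["Confidence"], 1))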