Human Generated Data

Title

Untitled (rephotographed portrait of woman sitting in rocking chair on porch)

Date

1935-1940, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10290

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Furniture 99.8
Chair 99.8
Person 97.7
Human 97.7
Machine 72.4
Flooring 71.7
Clothing 64.2
Apparel 64.2
Coat 64.2
Overcoat 64.2
Suit 64.2
Banister 62.9
Handrail 62.9
Tire 58.7
Tuxedo 56.7
Door 56
Car Wheel 55.5
Wheel 55.5
Floor 55.4
Person 45.9
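Label-detection services such as the one that produced the list above attach a confidence score to each tag, and consumers typically discard low-confidence labels. A minimal sketch of that filtering step, with the label/score pairs transcribed from the Amazon list above (the 90.0 threshold and the function name are arbitrary choices for illustration):

```python
# Label/score pairs transcribed from the Amazon tag list above
# (a representative subset, not the full list).
labels = [
    ("Furniture", 99.8), ("Chair", 99.8), ("Person", 97.7),
    ("Human", 97.7), ("Machine", 72.4), ("Flooring", 71.7),
    ("Clothing", 64.2), ("Tire", 58.7), ("Door", 56.0),
]

def filter_labels(pairs, min_confidence=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [(name, score) for name, score in pairs if score >= min_confidence]

high_confidence = filter_labels(labels)
print(high_confidence)
# Only the four labels scored at 90.0 or above remain.
```

With a threshold of 90.0 only Furniture, Chair, Person, and Human survive, which matches the intuition that the mid-range tags (Tire, Door, Machine) are less reliable readings of this photograph.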

Clarifai
created on 2019-11-16

people 99
monochrome 96.5
street 95.9
room 93.6
window 93.5
one 93.4
adult 91.7
man 87.6
furniture 86.7
indoors 86.3
vehicle 85.3
analogue 84.8
music 84.7
no person 83.6
group 83.3
portrait 82.5
woman 80
wear 79.8
two 79.6
train 79.2

Imagga
created on 2019-11-16

monitor 57.3
electronic equipment 50.4
equipment 49.8
computer 23.8
technology 23.7
window 22.4
interior 18.6
device 17.9
business 15.8
architecture 14.8
working 14.1
modern 14
house 13.4
office 13.1
room 12.7
work 12.5
man 12.1
black 11.4
industry 11.1
network 11
connection 11
data 10.9
television 10.9
building 10.8
engineer 10.8
support 10.8
light 10.7
steel 10.6
hardware 10.6
indoors 10.5
home 10.4
server 10.3
inside 10.1
glass 10.1
cable 10
metal 9.6
urban 9.6
chair 9.6
design 9.6
engineering 9.5
center 9.3
power 9.2
people 8.9
worker 8.9
desk 8.7
repair 8.6
wall 8.5
table 8.5
city 8.3
desktop computer 8.3
security 8.3
digital 8.1
machine 7.9
personal computer 7.8
male 7.8
3d 7.7
problem 7.7
system 7.6
electricity 7.5
information 7.1
radio 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.9
piano 96.8
black and white 94.9
window 86.2
white 83.8
black 71.4
monochrome 62.2
furniture 58.3
musical keyboard 51.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 25-39
Gender Male, 53.6%
Confused 45.1%
Angry 45.1%
Surprised 45.2%
Calm 53.5%
Disgusted 45.1%
Fear 45.1%
Sad 45.1%
Happy 45.8%

AWS Rekognition

Age 61-77
Gender Male, 54.4%
Calm 50.7%
Happy 45.1%
Angry 45.7%
Disgusted 45.8%
Surprised 45.1%
Fear 45.1%
Sad 47.4%
Confused 45.2%
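Each face reading above reports a confidence score per emotion, and the usual way to summarize such output is to take the highest-scoring emotion as the dominant one. A minimal sketch, with the scores transcribed from the second AWS Rekognition reading above (the function name is an illustrative choice):

```python
# Per-emotion confidence scores transcribed from the second
# AWS Rekognition face reading above.
emotions = {
    "Calm": 50.7, "Happy": 45.1, "Angry": 45.7, "Disgusted": 45.8,
    "Surprised": 45.1, "Fear": 45.1, "Sad": 47.4, "Confused": 45.2,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))
# Calm (50.7) edges out Sad (47.4) as the dominant reading.
```

Note how close the scores are: Calm at 50.7% only narrowly beats Sad at 47.4%, so the dominant label should be read as a weak signal rather than a firm classification.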

Feature analysis

Amazon

Person 97.7%

Categories

Imagga

interior objects 98.8%