Human Generated Data

Title

Untitled (three photographs: woman tossing wedding bouquet; man at open car door; two men with clothing boxes on bed)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12982

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 98.2
Horse 72.5
Animal 72.5
Mammal 72.5
Person 62.9
Advertisement 59.5
Collage 59.5
Poster 59.5
Screen 56.4
Electronics 56.4
Display 55.9
Monitor 55.9
LCD Screen 55.9
Person 41.6

Clarifai
created on 2019-11-16

people 99.7
adult 99.1
indoors 97.8
monochrome 97.1
man 96.8
vehicle 96.3
one 94.2
two 93.9
desk 93.8
furniture 91.8
group 91.5
room 91.1
woman 90.7
medicine 90.5
offense 90.3
hospital 88
wear 87.6
technology 85.5
sit 84.6
transportation system 84.2

Imagga
created on 2019-11-16

monitor 33.9
electronic equipment 31.5
equipment 29.2
computer 22.1
telephone 21.9
technology 19.3
device 18.4
black 18
display 17.9
pay-phone 17.1
office 17
television 16.5
business 16.4
screen 15.2
background 13.6
interior 13.3
working 13.2
desktop computer 13
people 12.8
modern 12.6
digital 12.1
desk 11.5
personal computer 11.4
indoors 11.4
studio 11.4
work 11
table 10.8
person 10.7
hand 10.6
professional 10.1
room 10.1
man 10.1
call 9.8
keyboard 9.5
window 9.5
furniture 9.2
adult 9
film 8.9
light 8.7
3d 8.5
digital computer 8.5
design 8.4
phone 8.3
one 8.2
laptop 8.2
style 8.2
music 8.1
looking 8
home 8
male 7.8
wall 7.7
communication 7.6
fashion 7.5
cord 7.4
occupation 7.3
appliance 7.3
businesswoman 7.3
sexy 7.2
body 7.2
women 7.1
electronic device 7.1
worker 7.1
businessman 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99
black and white 92.9
posing 60.3
mirror 55.1
set 30.3

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Female, 50%
Confused 45%
Surprised 48.5%
Sad 45.2%
Disgusted 45%
Calm 50.8%
Fear 45.1%
Angry 45.4%
Happy 45%

AWS Rekognition

Age 23-35
Gender Male, 54.1%
Fear 45.1%
Calm 48%
Angry 46.4%
Happy 45%
Confused 45.1%
Disgusted 45%
Sad 50.3%
Surprised 45.1%

AWS Rekognition

Age 14-26
Gender Female, 51.7%
Surprised 45.2%
Angry 45.2%
Sad 48.1%
Fear 45.2%
Disgusted 45.2%
Happy 46.3%
Calm 49.8%
Confused 45.1%

Feature analysis

Amazon

Horse 72.5%
Person 62.9%

Categories