Human Generated Data

Title

Untitled (two photographs: older man in tux and flowered cap leaning back in chair; young couple seen through window of airplane)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12814

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.1
Person 99.1
Person 98
Advertisement 95.8
Poster 95.8
Collage 95.8
Person 94.2
Apparel 92.3
Clothing 92.3
Helmet 87.1
Face 77.9
Transportation 67.5
Vehicle 67.3
Overcoat 66.6
Coat 66.6
Suit 66.6
Female 66.1
Train 63.3
People 62.8
Mammal 62.1
Cat 62.1
Pet 62.1
Animal 62.1
Shorts 58.4
Car Wheel 57.7
Wheel 57.7
Tire 57.7
Machine 57.7
Finger 55.1
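
The pairs above are label names with 0–100 confidence scores, the shape of output returned by Amazon Rekognition's detect_labels call. A minimal sketch of such a call, assuming boto3 and a local copy of the image; the file path, region, and thresholds are illustrative, not values taken from this record:

```python
# A minimal sketch of a Rekognition label call (path, region, and
# thresholds are placeholder assumptions).
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photograph.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55.0,  # the lowest score shown above is 55.1
    )

# Each label is a name plus a 0-100 confidence, the pairs listed above.
# Labels that also carry bounding-box Instances are the ones surfaced
# under "Feature analysis" further down the page.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```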

Clarifai
created on 2019-11-16

people 99.7
vehicle 98
woman 97.9
monochrome 97.4
adult 97.4
transportation system 96.7
man 96.5
car 96.2
two 95.6
one 93.4
vehicle window 93.3
portrait 93.2
hospital 92.9
child 92.3
indoors 89.8
sleep 88.2
room 87.2
wear 85.5
baby 84.9
group 84.8
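
The Clarifai scores read the same way: concept names with confidences, rendered here as percentages. A hedged sketch of a prediction against Clarifai's v2 REST API; the API key and image URL are placeholders, and the model ID is an assumption (the ID of Clarifai's public general-concepts model):

```python
# A hedged sketch of a Clarifai v2 prediction over REST; key, URL, and
# model ID are assumptions, not values from this record.
import requests

MODEL_ID = "aaa03c23b3724a16a56b629203edc62c"  # public "general" model (assumed)
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)
resp.raise_for_status()

# Concepts carry a 0-1 "value"; the page shows them as percentages,
# so 0.997 renders as 99.7.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```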

Imagga
created on 2019-11-16

television 49.7
equipment 32.7
electronic equipment 31.8
broadcasting 22.3
black 21
business 17
technology 16.3
telecommunication system 16.2
telecommunication 16.1
tape player 15.3
computer 14.9
man 14.8
modern 14.7
people 14.5
person 14.3
design 14.1
device 13.8
work 13.3
interior 13.3
male 12.8
monitor 12.4
cassette player 12.1
job 11.5
laptop 11.2
room 11
digital 10.5
office 10.4
entertainment 10.1
medium 10.1
bright 10
music 9.9
audience 9.7
steel 9.7
businessman 9.7
metal 9.6
sexy 9.6
crowd 9.6
home 9.6
player 9.4
flag 9.2
indoor 9.1
studio 9.1
silhouette 9.1
style 8.9
cheering 8.8
nighttime 8.8
stadium 8.7
symbol 8.7
furniture 8.7
light 8.7
patriotic 8.6
nation 8.5
lights 8.3
safe 8.2
success 8
icon 7.9
indoors 7.9
adult 7.8
empty 7.7
skill 7.7
media 7.6
finance 7.6
screen 7.6
communication 7.6
dark 7.5
electronic 7.5
teamwork 7.4
new 7.3
team 7.2
cassette 7.1
financial 7.1
silver 7.1
working 7.1
box 7
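
Imagga exposes tagging as a plain REST endpoint. A minimal sketch, assuming HTTP Basic auth with an API key/secret pair and a publicly hosted image URL (all placeholders); swapping /tags for the /categories/personal_photos endpoint yields category scores shaped like the "Categories" list at the end of this page, assuming that categorizer was the one used:

```python
# A minimal sketch of Imagga's v2 tagging endpoint; credentials and the
# image URL are placeholder assumptions.
import requests

AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=AUTH,
)
resp.raise_for_status()

# Each tag has an English label and a 0-100 confidence, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```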

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 97
indoor 93.8
black and white 87.1
person 80.1
clothing 78.9
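
The Microsoft tags match the shape of Azure Computer Vision's analyze operation, which returns tag names with 0–1 confidences (rendered here as percentages). A hedged sketch; the endpoint, API version, subscription key, and image URL are all placeholder assumptions:

```python
# A hedged sketch of an Azure Computer Vision tag request; endpoint,
# version, key, and URL are assumptions, not values from this record.
import requests

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"
resp = requests.post(
    f"{ENDPOINT}/vision/v2.0/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

# Confidences come back in 0-1; the page renders them as percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```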

Color Analysis

Face analysis

AWS Rekognition

Age 13-23
Gender Male, 54.9%
Surprised 45.2%
Fear 45.8%
Happy 45.1%
Sad 45.2%
Calm 53.5%
Disgusted 45.1%
Angry 45.1%
Confused 45%

AWS Rekognition

Age 8-18
Gender Female, 54.9%
Calm 45.1%
Happy 45.1%
Disgusted 45.1%
Surprised 45.6%
Angry 45.2%
Sad 45.2%
Fear 53.5%
Confused 45.1%

AWS Rekognition

Age 32-48
Gender Female, 53.8%
Disgusted 45%
Calm 45.3%
Angry 45.2%
Confused 45%
Fear 45.4%
Sad 45.1%
Surprised 45.2%
Happy 53.9%

AWS Rekognition

Age 15-27
Gender Male, 54.1%
Happy 45.3%
Disgusted 45.1%
Angry 46.4%
Fear 52.4%
Calm 45%
Surprised 45%
Sad 45.7%
Confused 45%

AWS Rekognition

Age 24-38
Gender Male, 53.2%
Sad 45.2%
Disgusted 45%
Confused 45.1%
Surprised 45.5%
Angry 45.4%
Calm 50.9%
Happy 47.9%
Fear 45%

AWS Rekognition

Age 13-25
Gender Male, 51.7%
Angry 45%
Calm 45%
Sad 45.1%
Happy 45%
Fear 54.7%
Confused 45%
Surprised 45.1%
Disgusted 45%

AWS Rekognition

Age 16-28
Gender Male, 54.8%
Fear 45.6%
Calm 53.1%
Surprised 45.4%
Confused 45.1%
Disgusted 45.4%
Happy 45.1%
Sad 45.2%
Angry 45.3%
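
Each block above is a per-face result of the kind Rekognition's detect_faces call returns with Attributes=["ALL"]: an estimated age range, a gender guess with its confidence, and one confidence score per emotion label. A minimal sketch, again assuming boto3 and a local copy of the image (path and region are placeholders):

```python
# A minimal sketch of the Rekognition face call behind these blocks.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")
with open("photograph.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# One FaceDetail per detected face, mirroring the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]      # e.g. {"Low": 13, "High": 23}
    gender = face["Gender"]     # e.g. {"Value": "Male", "Confidence": 54.9}
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:  # eight labels, each with a confidence
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```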

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
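
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than numeric scores, which is why these two blocks read differently from the Rekognition ones. A hedged sketch with the google-cloud-vision client library; application credentials and the image path are assumed:

```python
# A hedged sketch of Google Cloud Vision face detection (credentials and
# the image path are placeholder assumptions).
from google.cloud import vision


def bucket(likelihood) -> str:
    # "VERY_UNLIKELY" -> "Very unlikely", matching the wording above.
    return likelihood.name.replace("_", " ").capitalize()


client = vision.ImageAnnotatorClient()
with open("photograph.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; each attribute is a Likelihood enum.
for face in response.face_annotations:
    print("Surprise", bucket(face.surprise_likelihood))
    print("Anger", bucket(face.anger_likelihood))
    print("Sorrow", bucket(face.sorrow_likelihood))
    print("Joy", bucket(face.joy_likelihood))
    print("Headwear", bucket(face.headwear_likelihood))
    print("Blurred", bucket(face.blurred_likelihood))
```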

Feature analysis

Amazon

Person 99.1%
Helmet 87.1%
Train 63.3%
Cat 62.1%

Categories

Imagga

paintings art 71.7%
food drinks 21.8%
interior objects 4.4%