Human Generated Data

Title

Untitled (family standing around table inside house, Nazaré, Portugal)

Date

1967

People

Artist: Gordon W. Gahan, American, 1945–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.524.6

Machine Generated Data

Tags

Amazon
created on 2019-08-09

Human 99
Person 99
Apparel 98.9
Clothing 98.9
Person 98.8
Person 97.4
Person 95.8
Person 95.6
Person 94.5
Face 93.5
Person 81.8
Sleeve 76.1
Person 75.6
Coat 71.4
Overcoat 71.4
Suit 71.4
Finger 61.9
Portrait 61.9
Photo 61.9
Photography 61.9
Beard 61.8
Long Sleeve 58.7
Home Decor 56.5
Shirt 55.2
Senior Citizen 55.1

Clarifai
created on 2019-08-09

people 99.9
group 99.1
adult 98.7
group together 98.4
administration 96.5
leader 96.4
portrait 95.9
wear 95.7
man 95.2
several 93.9
many 91.6
five 90.6
woman 89.8
music 89.8
three 86.6
four 86.5
veil 85.1
outfit 83.1
two 82.4
one 80.8

Imagga
created on 2019-08-09

man 39.7
people 31.8
male 30.5
person 30
nurse 21.7
adult 21.7
businessman 20.3
professional 18.4
business 18.2
men 18
group 17.7
couple 16.5
office 14.6
senior 13.1
worker 12.9
teacher 12.4
room 11.7
portrait 11.6
job 11.5
black 11.4
happy 11.3
old 11.1
coat 11.1
team 10.7
human 10.5
corporate 10.3
sitting 10.3
teamwork 10.2
student 10.1
patient 10
holding 9.9
fashion 9.8
indoors 9.7
home 9.6
women 9.5
meeting 9.4
work 9.4
smiling 9.4
happiness 9.4
lab coat 9.1
silhouette 9.1
family 8.9
clothing 8.8
executive 8.8
lifestyle 8.7
businesspeople 8.5
mature 8.4
occupation 8.2
spectator 8.2
cheerful 8.1
handsome 8
idea 8
looking 8
to 8
medical 7.9
musician 7.9
color 7.8
world 7.8
travel 7.7
party 7.7
profession 7.6
tie 7.6
career 7.6
new 7.3
building 7.1
smile 7.1
night 7.1
interior 7.1
working 7.1
paper 7

Google
created on 2019-08-09

Photograph 97
Snapshot 85.6
Photography 62.4
Team 59.4
History 57.6

Microsoft
created on 2019-08-09

person 89.7
text 89
standing 82.7
posing 78.9
man 67.4
black and white 64.4
clothing 55.9

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Male, 95.5%
Disgusted 0.5%
Calm 25.3%
Sad 12.2%
Confused 4.4%
Fear 14.7%
Surprised 31.5%
Happy 3.9%
Angry 7.4%

AWS Rekognition

Age 50-68
Gender Male, 72.5%
Angry 83%
Fear 5.3%
Confused 0.2%
Calm 4.2%
Happy 0.1%
Sad 0.4%
Disgusted 0.2%
Surprised 6.6%

AWS Rekognition

Age 31-47
Gender Female, 56.5%
Disgusted 0.6%
Sad 79.9%
Fear 6.3%
Happy 0.1%
Surprised 0.2%
Confused 3.6%
Calm 0.2%
Angry 9.1%

AWS Rekognition

Age 31-47
Gender Male, 85.4%
Fear 40%
Surprised 18.7%
Sad 7%
Calm 4.1%
Happy 2.2%
Angry 23.6%
Confused 3.7%
Disgusted 0.7%

AWS Rekognition

Age 40-58
Gender Male, 96.9%
Confused 1.4%
Calm 9.2%
Angry 5.1%
Sad 78.1%
Surprised 0.5%
Fear 4.7%
Happy 0.3%
Disgusted 0.6%

AWS Rekognition

Age 32-48
Gender Female, 51.4%
Disgusted 1.2%
Happy 0.7%
Sad 72.3%
Calm 8.4%
Fear 2.3%
Angry 2.3%
Confused 10.4%
Surprised 2.5%

AWS Rekognition

Age 33-49
Gender Male, 63.1%
Angry 48.6%
Disgusted 7.4%
Sad 32.7%
Surprised 1.6%
Fear 2%
Calm 4.9%
Happy 0.3%
Confused 2.5%

Feature analysis

Amazon

Person 99%

Categories

Imagga

people portraits 81.1%
events parties 18.9%

Text analysis

Amazon

rrpE
RB

Google

EVS
EVS