Human Generated Data

Title

Untitled (rephotographed vintage portrait of eight member family)

Date

c. 1950-1960

People

Artist: Claseman Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11128

Machine Generated Data

Tags

Amazon
created on 2019-03-26

Human 99.7
Person 99.7
Person 99.3
Person 99.2
Person 99.2
Person 99.1
Person 98.9
Apparel 97.9
Clothing 97.9
Person 97.7
Person 95.8
Monitor 89.5
Display 89.5
Electronics 89.5
Screen 89.5
Coat 85.6
Accessories 84.7
Accessory 84.7
Tie 84.7
People 77.9
Overcoat 74.6
Suit 74.6
Shirt 68.2
Tie 65.2
Crowd 64
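
The Amazon tags above follow the label/confidence format produced by the AWS Rekognition DetectLabels API. As a rough illustration only (not the museum's actual pipeline), here is a minimal boto3 sketch that would produce comparable output, assuming AWS credentials are configured and the digitized print is available locally as family_portrait.jpg (a hypothetical filename):

import boto3

# Hypothetical local scan of the print; not part of the museum record.
IMAGE_PATH = "family_portrait.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=30,
    MinConfidence=60,
)

# Print label/confidence pairs in the same "Tag 99.7" style used above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")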

Clarifai
created on 2019-03-26

people 99.9
group 99.7
adult 97.8
group together 97.8
many 96.9
leader 95.9
portrait 94.7
man 94.4
television 91.6
wear 89.8
medical practitioner 88.7
woman 88.5
several 88
education 87.6
three 87.6
four 86.7
music 86.2
facial expression 84.9
administration 84.6
five 83.9

Imagga
created on 2019-03-26

television 100
telecommunication system 62.9
broadcasting 31.9
monitor 29.4
man 28.9
male 25.5
telecommunication 23.8
business 23.7
businessman 22
office 21.7
senior 21.5
person 21.4
people 21.2
portrait 18.7
adult 17.5
electronic equipment 16.4
old 16
equipment 15.8
medium 15.8
one 15.7
computer 14.7
executive 14.5
black 14.4
looking 14.4
elderly 14.3
work 14.1
mature 13.9
desk 13.2
happy 13.1
couple 13.1
modern 12.6
laptop 12
glasses 12
professional 11.9
smiling 11.6
boss 11.5
sitting 11.2
technology 11.1
money 11
suit 10.8
60s 10.7
handsome 10.7
older 10.7
age 10.5
tie 10.4
manager 10.2
indoor 10
smile 10
education 9.5
home theater 9.2
confident 9.1
currency 9
working 8.8
hair 8.7
corporate 8.6
men 8.6
face 8.5
finance 8.4
banking 8.3
liquid crystal display 8.3
occupation 8.2
blackboard 8.2
aged 8.1
success 8
worker 8
home 8
table 7.8
classroom 7.8
retired 7.7
leader 7.7
formal 7.6
serious 7.6
hand 7.6
career 7.6
dollar 7.4
symbol 7.4
theater 7.3
active 7.2
financial 7.1
job 7.1

Google
created on 2019-03-26

Microsoft
created on 2019-03-26

wall 98.2
window 97.6
posing 97.3
old 94.6
monitor 90.5
picture frame 89.4
person 85.3
gallery 72.2
group 63.8
image 35.9
display 30.4
museum 30.4
black and white 6.7
art 5.6
domain 4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Male, 91.8%
Angry 2.2%
Calm 82.7%
Sad 9.1%
Disgusted 0.6%
Surprised 1.8%
Confused 2.7%
Happy 0.9%

AWS Rekognition

Age 35-52
Gender Male, 92.7%
Sad 7.6%
Happy 1.4%
Disgusted 1.2%
Calm 84%
Angry 2.1%
Confused 1.5%
Surprised 2.2%

AWS Rekognition

Age 26-43
Gender Male, 98.8%
Happy 1.3%
Confused 0.9%
Calm 92.8%
Surprised 0.9%
Sad 2.6%
Angry 0.8%
Disgusted 0.8%

AWS Rekognition

Age 26-43
Gender Male, 87.4%
Surprised 3.9%
Sad 26.3%
Angry 4.8%
Calm 54.8%
Happy 1.7%
Confused 6.4%
Disgusted 2.2%

AWS Rekognition

Age 26-44
Gender Male, 96.3%
Happy 0.7%
Disgusted 0.6%
Calm 84.9%
Confused 1.1%
Surprised 1.2%
Angry 2.4%
Sad 9%

AWS Rekognition

Age 29-45
Gender Male, 75.7%
Calm 49.9%
Happy 5.8%
Angry 4.4%
Disgusted 4.7%
Confused 4.7%
Sad 26.4%
Surprised 4.1%

AWS Rekognition

Age 26-43
Gender Male, 85.8%
Sad 19.3%
Calm 56.4%
Surprised 5.6%
Angry 5.3%
Confused 4.5%
Disgusted 2.4%
Happy 6.5%

AWS Rekognition

Age 26-43
Gender Female, 85.9%
Disgusted 0.5%
Sad 18.9%
Angry 2%
Happy 3.6%
Calm 72.4%
Confused 1.3%
Surprised 1.4%
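
The face analysis blocks above mirror the structure of AWS Rekognition's DetectFaces response, which reports an age range, a gender estimate, and per-emotion confidences for each detected face. A minimal sketch, again assuming configured credentials and the hypothetical local file family_portrait.jpg, of how such per-face attributes could be retrieved:

import boto3

IMAGE_PATH = "family_portrait.jpg"  # hypothetical filename

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

# One block per detected face, mirroring the listing above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")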

Feature analysis

Amazon

Person 99.7%
Monitor 89.5%
Tie 84.7%

Categories

Text analysis

Amazon

AL3ES-
00t

Google

KODAKSOAFETY
KODAKSOAFETY