Human Generated Data

Title

Untitled (old couple)

Date

c. 1920

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.720

Machine Generated Data

Tags (with confidence scores, %)

Amazon
created on 2022-01-08

Clothing 99.8
Apparel 99.8
Person 96.7
Human 96.7
Suit 96.6
Overcoat 96.6
Coat 96.6
Person 87.5
Home Decor 85.1
Sleeve 69.2
Person 62.5
Linen 59.4
Plant 58.8
Tuxedo 58.6
Face 55.7
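
The record does not say how the Amazon list above was produced, but label/confidence pairs of this shape are what AWS Rekognition's DetectLabels operation returns. A minimal sketch using boto3; the bucket and object names are placeholders, not taken from this record:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_labels(
        # Placeholder image location, for illustration only.
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-old-couple.jpg"}},
        MaxLabels=20,
        MinConfidence=50,
    )
    for label in response["Labels"]:
        # Each label carries a name and a confidence score in percent.
        print(f"{label['Name']} {label['Confidence']:.1f}")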

Clarifai
created on 2023-10-25

people 100
portrait 99.8
group 99.4
two 99.3
adult 97.8
three 97.7
wear 96.7
man 96.7
wedding 96.3
menswear 94.9
leader 94.2
administration 93.9
outfit 93.1
four 93
woman 92.8
group together 87.9
retro 86.7
neckwear 85.6
actor 85.5
groom 84.9

Imagga
created on 2022-01-08

man 46.4
bow tie 44.6
suit 42.9
male 39.9
businessman 36.2
necktie 35.5
kin 34.5
person 32
adult 31.2
business 31
people 30.1
executive 27.4
portrait 26.5
happy 26.3
handsome 23.2
couple 22.7
tie 21.8
professional 21.5
garment 21.3
confident 20.9
office 20.9
standing 20.9
smiling 19.5
corporate 18.9
smile 18.5
success 18.5
men 18
jacket 17.5
senior 16.9
businesswoman 16.4
expression 16.2
family 16
clothing 15.7
successful 15.6
love 14.2
face 14.2
job 14.2
happiness 14.1
attractive 14
cheerful 13.8
lifestyle 13.7
alone 13.7
businesspeople 13.3
indoors 13.2
together 13.1
mature 13
grandma 13
looking 12.8
old 12.6
hand 12.2
home 12
holding 11.6
shirt 11.2
black 11.1
work 11
team 10.8
fashion 10.6
married 10.6
elderly 10.5
boss 10.5
modern 10.5
human 10.5
group 10.5
20s 10.1
friendly 10.1
groom 9.8
lady 9.7
style 9.6
30s 9.6
hair 9.5
two 9.3
teamwork 9.3
pretty 9.1
one 9
color 8.9
worker 8.8
husband 8.8
formal 8.6
elegant 8.6
casual 8.5
company 8.4
occupation 8.3
indoor 8.2
aged 8.2
building 7.9
day 7.9
60s 7.8
businessmen 7.8
colleagues 7.8
sibling 7.7
corporation 7.7
partnership 7.7
confidence 7.7
only 7.6
adults 7.6
meeting 7.5
glasses 7.4
wedding 7.4
room 7.3
women 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

person 99.6
standing 98.2
man 97.1
clothing 96.9
text 96.7
old 96.4
smile 95.2
human face 93.1
posing 93
suit 90.6
black 86.9
white 69.2
tie 66.3
vintage 31.2
clothes 25.4
necktie 16.3

Color Analysis

Face analysis

AWS Rekognition

Age 58-66
Gender Female, 99.9%
Happy 91.6%
Calm 5%
Disgusted 1.5%
Surprised 0.4%
Confused 0.4%
Sad 0.4%
Angry 0.4%
Fear 0.3%

AWS Rekognition

Age 56-64
Gender Male, 100%
Calm 99.7%
Confused 0.1%
Sad 0.1%
Angry 0%
Surprised 0%
Happy 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 23-33
Gender Female, 66.1%
Fear 77.8%
Calm 15.4%
Sad 2.8%
Surprised 2.4%
Angry 0.5%
Confused 0.4%
Disgusted 0.4%
Happy 0.3%
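
The three AWS Rekognition records above (age range, gender, and per-emotion confidences) match the shape of the DetectFaces response when all attributes are requested. A minimal sketch, again with a placeholder image location:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_faces(
        # Placeholder image location, for illustration only.
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-old-couple.jpg"}},
        Attributes=["ALL"],  # needed to get age range, gender, and emotions
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}, Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")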

Microsoft Cognitive Services

Age 57
Gender Male

Microsoft Cognitive Services

Age 24
Gender Male

Microsoft Cognitive Services

Age 68
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
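
The Google Vision rows above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred as likelihood buckets) correspond to the face-detection annotations of the Cloud Vision API. A minimal sketch with a placeholder image URI:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    image = vision.Image()
    image.source.image_uri = "gs://example-bucket/untitled-old-couple.jpg"  # placeholder

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Likelihoods are enum buckets such as VERY_UNLIKELY, POSSIBLE, LIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)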

Feature analysis

Amazon

Person 96.7%
Suit 96.6%

Categories

Imagga

paintings art 99%

Text analysis

Amazon

SAID2
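
The "SAID2" string under Text analysis is the kind of fragment AWS Rekognition's DetectText operation returns when it reads stray lettering in an image. A minimal sketch with a placeholder image location:

    import boto3

    rekognition = boto3.client("rekognition")
    response = rekognition.detect_text(
        # Placeholder image location, for illustration only.
        Image={"S3Object": {"Bucket": "example-bucket", "Name": "untitled-old-couple.jpg"}}
    )
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")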