Human Generated Data

Title

Untitled (portrait of old man with young girl in his lap inside house)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9143

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Chair 100
Furniture 100
Armchair 97
Person 96.2
Human 96.2
Clothing 84.3
Apparel 84.3
Couch 68.3
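
The Amazon tags above are label/confidence pairs of the kind returned by an object-detection service such as AWS Rekognition's `detect_labels`. As a minimal sketch, the hand-built response below mirrors the standard Rekognition response shape, with values copied from the tags listed above, and shows how such a response might be reduced to a flat list filtered by confidence:

```python
# Hypothetical sketch: reduce a Rekognition-style detect_labels response
# to a flat (tag, confidence) list. The response dict is a hand-built
# sample mirroring the documented response shape; the values are taken
# from the Amazon tags in this record.
response = {
    "Labels": [
        {"Name": "Chair", "Confidence": 100.0},
        {"Name": "Furniture", "Confidence": 100.0},
        {"Name": "Armchair", "Confidence": 97.0},
        {"Name": "Person", "Confidence": 96.2},
        {"Name": "Couch", "Confidence": 68.3},
    ]
}

def tags_above(response, threshold=90.0):
    """Return (name, confidence) pairs at or above the threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

print(tags_above(response))
```

With the default 90-point threshold, only the first four labels survive; "Couch" at 68.3 is dropped.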

Imagga
created on 2022-01-23

man 47.7
grandfather 42
male 41.2
home 38.3
senior 37.5
person 35
people 29
indoors 29
adult 28.4
sitting 27.5
laptop 27.1
computer 25.9
elderly 24.9
mature 22.3
room 21.9
smiling 21
chair 20.3
office 20.3
men 19.7
businessman 19.4
couple 19.2
happy 18.8
retired 18.4
business 18.2
old 17.4
house 16.7
retirement 16.3
looking 16
working 15.9
together 15.8
casual 15.2
work 15.2
family 15.1
lifestyle 14.5
worker 14.4
scholar 13.7
armchair 13.6
portrait 13.6
technology 13.4
couch 12.6
handsome 12.5
health 11.8
executive 11.6
husband 11.4
grandma 11.4
modern 11.2
seat 11.2
women 11.1
indoor 11
professional 10.9
intellectual 10.9
smile 10.7
older 10.7
face 10.6
using 10.6
teacher 10.5
horizontal 10
confident 10
relaxing 10
pensioner 10
furniture 9.9
living room 9.8
cheerful 9.8
one 9.7
monitor 9.6
table 9.5
reading 9.5
happiness 9.4
relaxed 9.4
two 9.3
camera 9.2
alone 9.1
holding 9.1
70s 8.9
seated 8.8
concentration 8.7
corporate 8.6
businesspeople 8.5
wife 8.5
living 8.5
suit 8.5
glasses 8.3
leisure 8.3
inside 8.3
aged 8.1
active 8.1
interior 8
job 8
notebook 7.9
60s 7.8
attractive 7.7
talking 7.6
desk 7.6
meeting 7.5
newspaper 7.5
screen 7.5
phone 7.4
20s 7.3
successful 7.3
lady 7.3
father 7.2
team 7.2
love 7.1
day 7.1
medical 7.1
look 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

furniture 97.4
chair 94.9
text 92.4
black and white 90.9
person 86.6
clothing 82.6
couch 61.4
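
Several services tagged this image independently, so one simple cross-check is the set of tags every service agrees on. A sketch, with the tag lists copied from the Amazon and Microsoft sections above and compared case-insensitively:

```python
# Hypothetical sketch: find tags that multiple tagging services agree
# on, comparing case-insensitively. Both tag lists are copied from
# the Amazon and Microsoft sections of this record.
amazon = ["Chair", "Furniture", "Armchair", "Person", "Human",
          "Clothing", "Apparel", "Couch"]
microsoft = ["furniture", "chair", "text", "black and white",
             "person", "clothing", "couch"]

def agreed(*tag_lists):
    """Tags present in every list, normalized to lowercase."""
    sets = [set(tag.lower() for tag in tags) for tags in tag_lists]
    return sorted(set.intersection(*sets))

print(agreed(amazon, microsoft))
```

Here both services converge on the furniture-and-sitter reading: chair, clothing, couch, furniture, person.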

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 75%
Surprised 60.3%
Calm 32.5%
Fear 4.9%
Angry 0.6%
Disgusted 0.5%
Sad 0.4%
Confused 0.4%
Happy 0.3%

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 76.4%
Angry 8%
Surprised 5.6%
Happy 3.9%
Disgusted 2.6%
Confused 1.7%
Sad 1.1%
Fear 0.8%
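
Each AWS Rekognition block above describes one detected face: an age range, a gender guess with confidence, and a full emotion distribution. A minimal sketch of pulling the dominant emotion out of a Rekognition-style `detect_faces` result, with the sample face hand-built from the first block above (age 20–28):

```python
# Hypothetical sketch: extract the dominant emotion from a
# Rekognition-style detect_faces face record. The dict is a hand-built
# sample mirroring the documented FaceDetails shape; values come from
# the first AWS Rekognition block in this record.
face = {
    "AgeRange": {"Low": 20, "High": 28},
    "Gender": {"Value": "Male", "Confidence": 75.0},
    "Emotions": [
        {"Type": "SURPRISED", "Confidence": 60.3},
        {"Type": "CALM", "Confidence": 32.5},
        {"Type": "FEAR", "Confidence": 4.9},
    ],
}

def dominant_emotion(face):
    """Return the emotion entry with the highest confidence."""
    return max(face["Emotions"], key=lambda e: e["Confidence"])

print(dominant_emotion(face)["Type"])
```

For this face the distribution is dominated by "Surprised" at 60.3%, which is what the record reports first.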

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
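
Unlike Rekognition's percentages, Google Vision reports each face attribute as a likelihood bucket. A sketch that puts the bucket names shown above into their standard ordering so they can be compared against a threshold:

```python
# Hypothetical sketch: Google Vision face annotations report each
# attribute as a likelihood bucket rather than a percentage. This maps
# the bucket names onto their standard ordering so a value can be
# tested against a threshold bucket.
LIKELIHOODS = ["Very unlikely", "Unlikely", "Possible",
               "Likely", "Very likely"]

def at_least(value, threshold):
    """True if `value` sits at or above `threshold` in the ordering."""
    return LIKELIHOODS.index(value) >= LIKELIHOODS.index(threshold)

# Every attribute in both Google Vision blocks above is "Very
# unlikely", so none of them clears a "Possible" threshold.
print(at_least("Very unlikely", "Possible"))
```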

Feature analysis

Amazon

Person 96.2%

Captions

Microsoft

a man sitting on a table 80.5%
a man sitting on a bed 66.1%
a man sitting in a chair 66%
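
Captioning services return several candidate sentences ranked by confidence; a record like this one typically surfaces the top candidate. A sketch of that selection, with the (caption, confidence) pairs copied from the Microsoft captions above:

```python
# Hypothetical sketch: choose the highest-confidence caption from a
# list of (text, confidence) candidates, as produced by a captioning
# service. The values are copied from the Microsoft captions in this
# record.
captions = [
    ("a man sitting on a table", 80.5),
    ("a man sitting on a bed", 66.1),
    ("a man sitting in a chair", 66.0),
]

def best_caption(captions):
    """Return the caption text with the highest confidence."""
    return max(captions, key=lambda c: c[1])[0]

print(best_caption(captions))
```

Here "a man sitting on a table" wins at 80.5, even though the tag data (chair 94.9–100) suggests the lower-ranked "a man sitting in a chair" is closer to the truth.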

Text analysis

Amazon

a
MJR