Human Generated Data

Title

Untitled (couple opening a gift)

Date

1959, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.197

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Furniture 99.8
Human 99.1
Person 99.1
Person 97.9
Couch 95.9
Indoors 93.9
Living Room 93.9
Room 93.9
Table Lamp 87.9
Lamp 87.9
Sitting 76.7
Home Decor 58.6
Shelf 58.2

Clarifai
created on 2023-10-25

people 100
adult 98.2
two 98.1
group 97.5
woman 96.4
furniture 96
portrait 95.6
man 95.3
group together 95
room 94.1
three 92.8
leader 92.3
monochrome 92
chair 91
child 89.7
four 89.6
wedding 89.6
seat 87.8
home 85.4
administration 84.9

Imagga
created on 2022-01-08

home 35.9
man 35.7
adult 34.5
male 34.4
people 32.9
person 32.3
laptop 28.4
office 26.3
businessman 25.6
business 25.5
indoors 25.5
businesswoman 25.5
meeting 25.5
happy 25.1
sitting 24.9
computer 24.9
couple 22.7
group 22.6
together 21.9
smiling 21.7
director 21.7
businesspeople 20.9
room 20.2
20s 19.3
smile 18.6
colleagues 18.5
team 17.9
women 17.4
indoor 17.4
30s 17.3
men 17.2
table 16.5
couch 16.4
corporate 16.3
child 16.3
desk 16.1
working 15.9
house 15.9
work 15.7
mother 14.3
talking 14.3
horizontal 14.2
family 14.2
females 14.2
scholar 14.2
lifestyle 13.7
portrait 13.6
happiness 13.3
attractive 13.3
adults 13.3
professional 13.1
nurse 12.7
40s 12.7
cheerful 12.2
teamwork 12.1
looking 12
groom 11.9
discussion 11.7
executive 11.4
face 11.4
intellectual 11.3
casual 11
patient 11
father 10.9
interior 10.6
education 10.4
senior 10.3
mature 10.2
dad 10.2
color 10
associates 9.8
discussing 9.8
coworkers 9.8
modern 9.8
conference 9.8
job 9.7
student 9.7
workplace 9.5
suit 9.4
two 9.3
presentation 9.3
life 9.3
children 9.1
holding 9.1
worker 9.1
classroom 9
teacher 9
boardroom 8.9
parent 8.9
husband 8.9
businessmen 8.8
corporation 8.7
expression 8.5
communication 8.4
document 8.4
coat 8.2
beverage 8.1
cup 8.1
success 8.1
handsome 8
to 8
jacket 8
daughter 7.9
boy 7.8
two people 7.8
partners 7.8
sofa 7.7
busy 7.7
break 7.6
chair 7.6
manager 7.5
camera 7.4
emotion 7.4
inside 7.4
hospital 7.3
successful 7.3
girls 7.3
love 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 98
sitting 97.6
person 95.5
indoor 89.1
clothing 87.6
black and white 67.3
man 64
family 17.8

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 16-22
Gender Male, 99.7%
Happy 98.4%
Sad 0.5%
Confused 0.3%
Surprised 0.2%
Fear 0.2%
Angry 0.2%
Disgusted 0.1%
Calm 0.1%

Microsoft Cognitive Services

Age 28
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Couch 95.9%

Categories

Imagga

people portraits 91.3%
food drinks 4.3%
paintings art 3.6%

Text analysis

Amazon

arlboro
JULL
-
as
YENNYOKA

Google

arlboro
arlboro