Human Generated Data

Title

Untitled (family in Christmas living room with woman supporting baby standing on floor)

Date

1955

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9519

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.4
Human 99.4
Person 98.6
Person 98.1
Person 87.9
Interior Design 85.5
Indoors 85.5
Clothing 85
Apparel 85
Room 78
People 66.8
Meal 59.2
Food 59.2
Table Lamp 58.2
Lamp 58.2
Chair 57.7
Furniture 57.7

Clarifai
created on 2023-10-26

people 100
group 99
monochrome 98.8
group together 98.8
child 98.2
adult 97.8
many 97.1
street 95.1
woman 94.4
man 94.3
wear 93.2
several 91.4
two 91.1
three 90.1
war 89.7
recreation 89.2
administration 84.2
leader 83.5
commerce 81.6
music 80.8

Imagga
created on 2022-01-28

man 26.2
male 24.9
people 21.7
person 20
work 17.4
business 16.4
musical instrument 15.4
adult 14.9
men 14.6
dad 14.5
black 13.8
businessman 13.2
room 12.7
portrait 12.3
hand 12.1
group 12.1
happy 11.9
happiness 11.7
father 11.6
old 11.1
smiling 10.8
job 10.6
office 10.4
sitting 10.3
building 10.2
life 10.1
silhouette 9.9
professional 9.7
new 9.7
couple 9.6
education 9.5
day 9.4
construction 9.4
technology 8.9
home 8.8
parent 8.3
sky 8.3
holding 8.2
light 8.1
team 8.1
stringed instrument 7.8
face 7.8
color 7.8
school 7.6
serious 7.6
student 7.4
camera 7.4
worker 7.4
patient 7.4
looking 7.2
smile 7.1
women 7.1
medical 7.1
equipment 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 97.3
man 95.5
text 93.2
clothing 81.2
air 67.8
jumping 66.4
doing 62.8
trick 55.9
male 19.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 20-28
Gender Male, 72.9%
Calm 91.9%
Happy 5%
Sad 1.4%
Surprised 0.5%
Angry 0.4%
Confused 0.3%
Fear 0.2%
Disgusted 0.2%

Feature analysis

Amazon

Person
Person 99.4%

Categories

Text analysis

Amazon

e
:
STRUPT
2019