Human Generated Data

Title

Untitled (two women having tea)

Date

1950

People

Artist: Peter James Studio, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.20054

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.7
Human 99.7
Person 99.6
Pottery 97.2
Saucer 95.1
Table 92.9
Furniture 92.9
Sunglasses 84.1
Accessories 84.1
Accessory 84.1
Porcelain 82.1
Art 82.1
Restaurant 76.1
Cup 73.7
Coffee Cup 73.6
Dish 69.9
Meal 69.9
Food 69.9
Pot 63.6
Photography 61.4
Photo 61.4
Teapot 61.4
Bowl 59.7
Dining Table 59.4
Food Court 56.8

Clarifai
created on 2023-10-22

people 99.7
adult 98.9
woman 98.7
group 97.4
man 96.9
monochrome 96.4
indoors 96.3
sit 94.9
two 93.3
table 91.9
portrait 91.7
three 90.5
four 90.5
group together 90.2
restaurant 89.6
furniture 88.3
room 84
hotel 84
drink 83.5
coffee 82.7

Imagga
created on 2022-03-05

man 45.1
person 33.2
people 31.2
male 30.5
business 26.7
adult 25.1
waiter 25
working 24.7
team 24.2
businessman 23.8
coat 23.5
lab coat 23.2
table 22.6
sitting 22.3
office 22.2
30s 22.1
meeting 21.7
work 21.2
professional 21.1
happy 20
men 19.7
worker 19.7
20s 19.2
smiling 18.8
employee 18.4
businesswoman 18.2
talking 18
colleagues 17.5
lifestyle 16.6
couple 16.5
businesspeople 16.1
dining-room attendant 16.1
medical 15
indoors 14.9
teamwork 14.8
home 14.3
desk 14.3
women 14.2
together 14
mature 13.9
restaurant 13.9
40s 13.6
smile 13.5
doctor 13.1
senior 13.1
group 12.9
patient 12.9
day 12.5
clothing 12.3
adults 12.3
portrait 12.3
bright 12.1
executive 12
room 12
casual 11.9
laptop 11.8
suit 11.8
holding 11.5
food 11.5
computer 11.2
dinner 11.2
camera 11.1
indoor 10.9
drink 10.9
coworkers 10.8
having 10.6
job 10.6
mid adult 10.6
coffee 10.2
wine 10.2
four people 9.9
kitchen 9.8
discussion 9.7
daytime 9.6
hospital 9.5
two 9.3
attractive 9.1
cheerful 8.9
color 8.9
boardroom 8.9
garment 8.9
collaboration 8.9
associates 8.8
discussing 8.8
days 8.8
scientist 8.8
looking 8.8
half length 8.8
nurse 8.7
thirties 8.7
lab 8.7
cooperation 8.7
four 8.6
happiness 8.6
student 8.6
corporate 8.6
meal 8.4
communication 8.4
emotion 8.3
confident 8.2
handsome 8
to 8
interior 8
standing 7.8
assistant 7.8
busy 7.7
twenties 7.6
health 7.6
hand 7.6
eating 7.6
focus 7.4
light 7.3
life 7.3
success 7.2
science 7.1
face 7.1
medicine 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.4
tableware 97.7
person 97.4
black and white 93.7
bottle 82.4
man 81.2
clothing 76
table 70.6
human face 67
drink 65.5
coffee cup 62.4
coffee 57.3
saucer 54.4
cup 50.5
dinner 31.5

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 35-43
Gender Male, 74.4%
Happy 96.4%
Surprised 1.8%
Sad 0.6%
Confused 0.5%
Calm 0.2%
Disgusted 0.2%
Fear 0.1%
Angry 0.1%

Feature analysis

Amazon

Person
Sunglasses
Dining Table
Person 99.7%

Text analysis

Amazon

KODASEIA