Human Generated Data

Title

Untitled (groom feeding cake to bride at reception)

Date

1949-1967

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6174

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.4
Human 99.4
Person 99.1
Food 99
Meal 99
Cake 99
Dessert 99
Icing 98.9
Cream 98.9
Creme 98.9
Dish 92.6
Home Decor 84.8
Person 83.3
Clothing 76.8
Apparel 76.8
People 70.2
Furniture 69
Table 69
Chair 66
Dining Table 65.2
Torte 61.4
Cafeteria 58.5
Restaurant 58.5

Clarifai
created on 2023-10-26

people 99.7
group 96.5
adult 96.4
man 95.9
two 95.3
three 93.4
monochrome 91.7
furniture 90.7
group together 90.3
room 88.8
indoors 87.8
chair 86
leader 82.8
four 82.8
woman 82.8
scientist 79.9
several 77.6
dining room 75.5
administration 73.4
music 72.6

Imagga
created on 2022-01-23

man 30.2
person 26.2
television 24.1
people 22.9
bartender 22.5
male 19.8
senior 19.7
telecommunication system 16.9
lifestyle 15.9
adult 15.7
computer 15.7
indoors 14.9
mature 14.9
office 14.6
business 14.6
elderly 14.4
home 14.3
happy 13.8
looking 13.6
kitchen 13.4
sitting 12.9
monitor 12.6
old 12.5
room 12.4
working 12.4
portrait 12.3
health 11.8
handsome 11.6
black 11.4
education 11.2
blackboard 11.2
indoor 10.9
smiling 10.8
salon 10.7
laptop 10.6
class 10.6
technology 10.4
casual 10.2
table 9.8
businessman 9.7
one 9.7
professional 9.7
retired 9.7
restaurant 9.6
couple 9.6
work 9.4
hand 9.1
modern 9.1
interior 8.8
shop 8.8
worker 8.7
desk 8.7
retirement 8.6
age 8.6
screen 8.5
communication 8.4
attractive 8.4
holding 8.2
active 8.1
equipment 8
grandma 8
information 8
medical 7.9
hair 7.9
love 7.9
cooking 7.9
smile 7.8
classroom 7.8
men 7.7
reading 7.6
teacher 7.5
food 7.4
glasses 7.4
clothing 7.4
back 7.3
alone 7.3
chair 7.2
science 7.1
day 7.1
together 7

Google
created on 2022-01-23

Tableware 93.4
Food 92.7
Table 92.6
Drinkware 91.9
Black 89.6
Stemware 88.8
Plate 87.4
Serveware 85.2
Hat 84.9
Black-and-white 84
Chair 81.6
Cake 80.6
Dishware 80
Barware 79.3
Glass 75.2
Suit 74.2
Monochrome photography 73.6
Dessert 72.3
Art 71.6
Monochrome 71.4

Microsoft
created on 2022-01-23

text 99.5
person 97.6
man 91.8
black and white 84.7
window 84.1
birthday cake 72.3
candle 66.7
clothing 61.9

Face analysis

AWS Rekognition

Age 54-62
Gender Female, 88%
Calm 91.9%
Surprised 4%
Confused 1.1%
Happy 1%
Fear 0.6%
Disgusted 0.5%
Angry 0.5%
Sad 0.4%

AWS Rekognition

Age 29-39
Gender Female, 99.4%
Surprised 35.5%
Sad 22.2%
Happy 17.2%
Fear 9.5%
Calm 7.5%
Confused 3.9%
Angry 2.2%
Disgusted 2.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Chair 66%