Human Generated Data

Title

Untitled (view of wedding banquet table seated with guests and wedding party)

Date

1957

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6529

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Furniture 99.7
Table 98.9
Dining Table 98.9
Person 98.1
Human 98.1
Person 96.9
Person 95.4
Indoors 93.3
Dining Room 93.3
Room 93.3
Restaurant 90
Person 84.1
Food 77.3
Meal 77.3
Person 75.7
Dish 69.8
Person 69.7
Person 61.1
Chess 60.6
Game 60.6
Person 60.4
Chair 60.3
Tabletop 59.4
Cafe 58
Pottery 55.2
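
The Amazon tags above pair each detected label with a confidence score expressed as a percentage. As a point of reference, the sketch below shows how comparable labels could be requested from AWS Rekognition with boto3; the image path, region, MaxLabels, and MinConfidence values are illustrative assumptions and are not part of this record.

# Sketch: fetch image labels with AWS Rekognition (boto3).
# Assumptions: the local image path, region, and thresholds are illustrative;
# the record above only states that the tags came from Amazon on 2019-03-25.
import boto3

def detect_labels(image_path, region="us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=25,         # cap the number of returned labels
            MinConfidence=55.0,   # drop low-confidence labels
        )
    # Each label carries a name and a confidence in percent,
    # e.g. ("Furniture", 99.7), matching the format of the list above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("banquet_photo.jpg"):  # hypothetical file name
        print(f"{name} {confidence:.1f}")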

Clarifai
created on 2019-03-25

people 99.2
group 97.1
monochrome 95.8
adult 94.9
many 94.6
man 93.6
wear 92.3
group together 90.2
furniture 88.9
woman 87.7
chair 86.2
several 82.9
veil 81.5
one 79.8
black and white 79.5
illustration 78.8
room 77.9
design 77.7
art 77.6
vehicle 76.9

Imagga
created on 2019-03-25

case 54.6
glass 18.9
history 13.4
decoration 12.3
shop 11.5
technology 11.1
antique 10.9
table 10.6
medicine 10.6
ancient 10.4
people 10
retro 9.8
equipment 9.8
old 9.7
medical 9.7
perfume 9.5
religion 9
science 8.9
art 8.8
setting 8.7
mercantile establishment 8.5
black 8.4
city 8.3
wedding 8.3
vintage 8.3
gold 8.2
man 7.7
health 7.6
sculpture 7.5
historical 7.5
film 7.5
tourism 7.4
person 7.4
work 7.3
business 7.3
holiday 7.2
interior 7.1
travel 7

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

text 98.3
person 94.4
black and white 94.4
winter 57.3
monochrome 24.1
street 20.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 53.1%
Angry 45.2%
Sad 45.4%
Surprised 45.1%
Confused 45.1%
Calm 53.9%
Happy 45.2%
Disgusted 45.1%

AWS Rekognition

Age 35-52
Gender Female, 51.7%
Happy 45%
Disgusted 45%
Calm 53.2%
Sad 46.6%
Surprised 45.1%
Confused 45.1%
Angry 45.1%

AWS Rekognition

Age 48-68
Gender Male, 54.7%
Happy 45.1%
Sad 45.9%
Surprised 45.2%
Disgusted 45.2%
Angry 45.4%
Calm 52.9%
Confused 45.3%

AWS Rekognition

Age 35-55
Gender Male, 52.3%
Disgusted 46.5%
Surprised 46.2%
Sad 46%
Confused 45.6%
Angry 45.7%
Happy 46.3%
Calm 48.6%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Happy 49.5%
Surprised 49.6%
Sad 49.6%
Confused 49.6%
Angry 49.5%
Disgusted 49.5%
Calm 50.2%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Calm 49.9%
Sad 49.6%
Disgusted 49.7%
Happy 49.6%
Confused 49.5%
Angry 49.6%
Surprised 49.6%

AWS Rekognition

Age 4-9
Gender Male, 50%
Sad 50.1%
Happy 49.5%
Confused 49.6%
Surprised 49.6%
Disgusted 49.6%
Calm 49.6%
Angry 49.6%
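
Each face block above reports an estimated age range, a gender estimate with its confidence, and a confidence score for each of seven emotions, all attributed to AWS Rekognition. The sketch below shows how such per-face attributes could be requested with boto3; the image path and region are illustrative assumptions, not details taken from this record.

# Sketch: per-face age range, gender, and emotion confidences with AWS Rekognition.
# Assumptions: the image path and region are illustrative; the record above only
# attributes the face analysis to AWS Rekognition.
import boto3

def analyze_faces(image_path, region="us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Emotion types include HAPPY, SAD, ANGRY, CONFUSED, DISGUSTED,
            # SURPRISED, and CALM; confidence is in percent.
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")

if __name__ == "__main__":
    analyze_faces("banquet_photo.jpg")  # hypothetical file name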

Feature analysis

Amazon

Person 98.1%
Chess 60.6%
Chair 60.3%

Text analysis

Amazon

XAGOX
Y
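
The strings above ("XAGOX" and "Y") are the raw text detections for this photograph. For reference, the sketch below shows how line-level text detections could be requested from AWS Rekognition with boto3; the image path is an illustrative assumption.

# Sketch: OCR-style text detection with AWS Rekognition.
# Assumptions: the image path is illustrative; the record above only attributes
# the detected strings to Amazon.
import boto3

def detect_text(image_path, region="us-east-1"):
    client = boto3.client("rekognition", region_name=region)
    with open(image_path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Keep line-level detections only; word-level detections are also returned.
    return [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]

if __name__ == "__main__":
    print(detect_text("banquet_photo.jpg"))  # hypothetical file name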