Human Generated Data

Title

Untitled (man and seated family posing in home furnishings store)

Date

1952

People

Artist: Orrion Barger, American, active 1913 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6298

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Restaurant 99.1
Person 98.7
Human 98.7
Person 97.9
Person 97.4
Person 97.1
Apparel 95.8
Clothing 95.8
Cafe 91.5
Cafeteria 89.1
Furniture 85.9
Couch 85.9
Chair 84.1
Accessories 82.9
Sunglasses 82.9
Accessory 82.9
Food 76.7
Meal 76.7
Sitting 76.3
Overcoat 59.6
Coat 59.6
Table 59.3
Food Court 55.9
Couch 55.7
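
The Amazon values above are label names with detection confidence scores on a 0-100 scale. A minimal sketch of how such labels are typically obtained from the AWS Rekognition DetectLabels API via boto3 follows; the image path, region, and thresholds are illustrative assumptions, not part of this record.

```python
import boto3

# Assumed local path to the digitized print; not part of the catalog record.
IMAGE_PATH = "4.2002.6298.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open(IMAGE_PATH, "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,        # assumed cap on returned labels
        MinConfidence=55.0,  # assumed floor; lowest score listed above is ~55.7
    )

# Print "Name Confidence" pairs in the same layout as the tag list.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```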

Imagga
created on 2022-01-22

salon 43
shop 31.1
barbershop 25.7
people 25.6
business 24.9
man 23.5
mercantile establishment 20
center 18.9
interior 18.6
chair 18.3
person 17.7
work 17.3
restaurant 16.5
computer 15.2
city 14.9
indoors 14.9
male 14.9
men 14.6
office 14.5
building 14.4
room 14.3
counter 14.2
working 14.1
urban 14
hairdresser 13.8
adult 13.8
occupation 13.7
place of business 13.6
businessman 13.2
table 12.1
travel 12
modern 11.9
women 11.9
furniture 11.8
design 11.8
architecture 11.7
indoor 10.9
transportation 10.7
job 10.6
equipment 10.6
glass 10.1
light 10
businesswoman 10
worker 9.8
professional 9.7
style 9.6
crowd 9.6
smiling 9.4
lifestyle 9.4
two 9.3
horizontal 9.2
inside 9.2
technology 8.9
sitting 8.6
barroom 8.5
smile 8.5
industry 8.5
machine 8.5
happy 8.1
chairs 7.8
station 7.7
empty 7.7
casual 7.6
laptop 7.5
silhouette 7.4
lights 7.4
teamwork 7.4
back 7.3
transport 7.3
seat 7.3
industrial 7.3
black 7.2
monitor 7.2
home 7.2
hall 7.2
night 7.1
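
The Imagga values are tag confidences on a 0-100 scale. A hedged sketch of a request against Imagga's v2 tags endpoint is shown below; the credentials and image URL are placeholders, and the response layout is an assumption based on Imagga's public v2 API.

```python
import requests

# Placeholder credentials; Imagga's v2 API uses HTTP basic auth (key, secret).
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
# Placeholder URL for the digitized print.
IMAGE_URL = "https://example.org/4.2002.6298.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

# Each tag carries an English label and a 0-100 confidence, as listed above.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```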

Google
created on 2022-01-22

Hat 87
Black-and-white 86
Style 83.9
Line 81.6
Monochrome photography 74.4
Monochrome 72.4
Room 70.3
Sitting 68.1
Chair 67.8
Table 67.6
Stock photography 65.1
Couch 64.6
Font 63.6
Customer 63.4
Street 61.6
Suit 60.5
Coffee table 55.3
Photographic paper 54.9
Shelf 53.8
Conversation 52.5
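
Google's label scores are returned as 0-1 floats and are shown above scaled to percentages. A minimal sketch using the google-cloud-vision client's label detection helper, under the assumption of a locally available scan, follows.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Assumed local scan; authentication comes from application default credentials.
with open("4.2002.6298.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1 floats; the list above shows them as percentages.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```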

Microsoft
created on 2022-01-22

text 97.2
clothing 92.4
person 91.4
black and white 88.8
furniture 70.8
woman 68.1
chair 64
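
The Microsoft values correspond to image tags from the Azure Computer Vision service. A sketch using the azure-cognitiveservices-vision-computervision SDK is shown below; the endpoint, key, and file name are placeholders, not details from this record.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Assumed local scan of the print.
with open("4.2002.6298.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

# Confidence is a 0-1 float; shown above as a percentage.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```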

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 98%
Calm 99.7%
Sad 0.1%
Surprised 0%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%
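
The age range, gender, and emotion percentages above match the shape of an AWS Rekognition DetectFaces response. A minimal boto3 sketch, with an assumed local image path, is shown below.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Assumed local path to the digitized print.
with open("4.2002.6298.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```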

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
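
The three blocks above correspond to three detected faces, each reported with Google Vision's bucketed likelihoods (Very unlikely through Very likely). A sketch using the google-cloud-vision face detection helper follows; it assumes the current proto-plus based client, where likelihood fields expose a .name string.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Assumed local scan of the print.
with open("4.2002.6298.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face; likelihoods are bucketed enum values.
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```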

Feature analysis

Amazon

Person 98.7%
Couch 85.9%
Sunglasses 82.9%

Captions

Microsoft

a group of people sitting in front of a store 73.1%
a group of people standing in front of a store 73%
a group of people in front of a store 72.9%
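
The captions are alternative image descriptions with confidences from Azure Computer Vision's describe operation. A short sketch is given below; the endpoint, key, and file name are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint and key for an Azure Computer Vision resource.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Assumed local scan; ask for up to three alternative captions.
with open("4.2002.6298.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

# Each caption has a 0-1 confidence, shown above as a percentage.
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```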

Text analysis

Amazon

ea
KODVK-SVEEIA
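
The strings above match what AWS Rekognition's DetectText returns for text found in the image (here, apparent film-edge markings). A minimal boto3 sketch with an assumed image path follows.

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Assumed local path to the digitized print.
with open("4.2002.6298.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections roughly correspond to the strings listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```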

Google

CU
CU
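
The Google entries are OCR detections, typically obtained with text detection in the google-cloud-vision client. A minimal sketch under the same local-file assumption is shown below.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Assumed local scan of the print.
with open("4.2002.6298.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are individual words.
for annotation in response.text_annotations[1:]:
    print(annotation.description)
```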