Human Generated Data

Title

Untitled (man and woman standing behind jewelry counter)

Date

1949

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6251

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.6
Human 99.6
Person 99.4
Sweets 99.2
Confectionery 99.2
Food 99.2
Bakery 96.8
Shop 96.8
Shelf 95.7

Clarifai
created on 2023-10-26

people 99.9
adult 98.8
commerce 98.1
two 98.1
one 97.9
group 97.7
man 96.5
furniture 95.8
stock 94.9
monochrome 94.6
many 93.5
three 91.2
group together 90.2
shelf 90
market 89.7
merchant 89.3
wear 88.9
room 88.6
vehicle 88
employee 85.4

Imagga
created on 2022-01-22

shop 64.5
tobacco shop 53.2
mercantile establishment 49
place of business 32.5
people 27.9
marimba 24.8
man 23.5
bakery 23.2
business 23.1
counter 20.7
percussion instrument 20
male 19.1
person 18.1
bartender 17.3
musical instrument 16.5
establishment 15.8
smiling 15.2
work 14.9
old 14.6
men 13.7
building 13.7
city 13.3
architecture 13.3
businessman 13.2
restaurant 12.5
indoors 12.3
adult 12.1
portrait 11.6
office 11.5
happy 11.3
senior 11.2
looking 11.2
women 11.1
horizontal 10.9
stall 10.9
night 10.6
sitting 10.3
money 10.2
one 9.7
couple 9.6
standing 9.5
professional 9.5
store 9.4
lifestyle 9.4
worker 9.1
black 9
technology 8.9
urban 8.7
room 8.6
smile 8.5
finance 8.4
manager 8.4
student 8.1
history 8
computer 8
working 7.9
newspaper 7.9
day 7.8
middle aged 7.8
travel 7.7
customer 7.6
casual 7.6
mature 7.4
camera 7.4
new 7.3
success 7.2
aged 7.2
love 7.1
job 7.1

Google
created on 2022-01-22

Smile 94.5
Black 89.6
Shelf 85.1
Style 83.8
Black-and-white 83.2
Barware 80.5
Font 80.4
T-shirt 78.9
Eyewear 77.3
Retail 73.4
Display case 73
Monochrome photography 71.6
Shelving 70.1
Monochrome 68.9
Machine 62.9
Room 61
Advertising 59.6
Collection 58.6
Hat 55.6
Convenience store 54.6

Microsoft
created on 2022-01-22

text 99.7
indoor 90.5
person 87.9
man 80.4
black and white 80.1
clothing 75.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 39-47
Gender Male, 100%
Sad 39.7%
Calm 20.2%
Surprised 11%
Confused 8.5%
Angry 7.2%
Happy 6.4%
Disgusted 5.1%
Fear 1.9%

AWS Rekognition

Age 18-24
Gender Male, 69.4%
Calm 76.9%
Angry 8.4%
Fear 5.1%
Confused 3.6%
Sad 3.3%
Surprised 1.4%
Disgusted 0.8%
Happy 0.6%

AWS Rekognition

Age 24-34
Gender Male, 99.3%
Happy 95.7%
Surprised 1.9%
Calm 0.8%
Angry 0.5%
Fear 0.3%
Sad 0.3%
Disgusted 0.3%
Confused 0.2%

Feature analysis

Amazon

Person 99.6%

Categories

Text analysis

Amazon

JEUELKY
YT33AS
English