Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3048

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Cafeteria 99.9
Indoors 99.9
Restaurant 99.9
Food 99.9
Meal 99.9
Architecture 99.9
Building 99.9
Dining Room 99.9
Dining Table 99.9
Furniture 99.9
Room 99.9
Table 99.9
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Dish 98.5
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 98
Male 98
Man 98
Person 98
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
People 97.1
Person 96
Baby 96
Face 91.1
Head 91.1
Cutlery 79.7
Spoon 79.7
Plate 69.7
Eating 62.7
Cup 59.4
Animal 57.4
Sea Life 57.4
Food Court 56.5
Dinner 55.7
Lunch 55.7
Seafood 55.3
Kitchen 55

Clarifai
created on 2018-05-10

people 99.8
group 98.8
group together 98.6
adult 98.4
man 97.4
war 94.7
four 93
several 92.3
military 91.1
recreation 91
woman 90.8
room 89.9
administration 89.8
five 89.4
furniture 89.2
sit 86.5
education 83.3
child 82.3
uniform 81.5
three 79.9

Imagga
created on 2023-10-06

man 41.3
people 35.7
office 34.6
male 34.1
sitting 33.5
adult 32.3
person 29.4
table 28.7
meeting 27.3
business 26.7
businessman 26.5
restaurant 24.8
professional 24.7
businesswoman 22.7
working 22.1
indoors 22
desk 21.9
computer 21.7
group 21
executive 20.7
corporate 20.6
smiling 20.3
happy 20.1
businesspeople 19.9
work 19.6
classroom 19.1
men 18.9
team 18.8
laptop 18.5
couple 17.4
suit 17.1
smile 17.1
job 16.8
together 16.7
indoor 16.4
room 16.4
workplace 16.2
teamwork 15.8
women 15
looking 14.4
home 14.4
senior 14.1
lifestyle 13.7
communication 13.4
holding 13.2
mature 13
teacher 12.6
drink 12.5
coffee 12.1
occupation 11.9
20s 11.9
paper 11.8
colleagues 11.7
adults 11.4
worker 11.2
glass 10.9
conference 10.8
scholar 10.5
talking 10.5
bow tie 10.2
two 10.2
dinner 10.1
beverage 10
face 10
employee 9.9
discussion 9.7
portrait 9.7
education 9.5
females 9.5
friends 9.4
presentation 9.3
hand 9.1
student 9.1
clothing 9.1
building 9
cheerful 8.9
technology 8.9
color 8.9
success 8.9
colleague 8.8
secretary 8.7
day 8.6
staff 8.6
casual 8.5
attractive 8.4
intellectual 8.4
document 8.4
alcohol 8.3
emotion 8.3
cup 8.1
necktie 8.1
food 7.9
coworkers 7.9
happiness 7.8
40s 7.8
books 7.7
daytime 7.7
30s 7.7
elderly 7.7
drinking 7.7
dining 7.6
college 7.6
career 7.6
meal 7.4
glasses 7.4
camera 7.4
friendly 7.3
alone 7.3
confident 7.3
handsome 7.1
love 7.1
look 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.7
indoor 98

Face analysis

AWS Rekognition

Age 27-37
Gender Male, 98%
Sad 93%
Calm 43.5%
Confused 7.1%
Surprised 6.5%
Fear 6.2%
Angry 1.8%
Happy 1.1%
Disgusted 0.5%

AWS Rekognition

Age 42-50
Gender Male, 99.9%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.9%
Confused 0.2%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 20-28
Gender Male, 99.2%
Calm 94.4%
Surprised 6.4%
Fear 6%
Sad 3.4%
Happy 1.1%
Confused 0.3%
Disgusted 0.1%
Angry 0.1%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 95.8%
Surprised 6.5%
Fear 6%
Sad 2.3%
Confused 1.4%
Disgusted 0.9%
Angry 0.4%
Happy 0.2%

AWS Rekognition

Age 23-33
Gender Male, 98%
Calm 96.1%
Surprised 6.4%
Fear 6%
Sad 2.8%
Disgusted 1.1%
Angry 0.1%
Happy 0.1%
Confused 0.1%

Microsoft Cognitive Services

Age 54
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%
Baby 96%
Spoon 79.7%
Plate 69.7%
Cup 59.4%